ADVANCED MANAGEMENT ACCOUNTING Lecture 2
1
ADVANCED MANAGEMENT ACCOUNTING Lecture (2)
  • Cost Estimation and Behaviour II

2
Last Lecture Summary
  • Cost classifications for predicting cost
    behaviour (i.e. how a certain cost will behave in
    response to a change in activity)
  • Variable cost: a cost that varies, in total, in
    direct proportion to changes in the level of
    activity.
  • Fixed cost: a cost that remains constant, in
    total, regardless of changes in the level of
    activity.
  • Mixed cost: a cost that contains both variable
    and fixed cost elements.

3
Last Lecture Summary (cont)
  • How does management go about actually estimating
    the fixed and variable components of a mixed
    cost? There are five methods:
  • The account analysis (inspection of the
    accounts) method,
  • The engineering approach,
  • The high-low method,
  • The scatter graph (graphical) method, and
  • The least-squares regression method (today's
    topic).

4
The least-squares regression method
  • This method determines mathematically the
    regression line of best fit (i.e. it uses
    mathematical formulas to fit the regression
    line).
  • It is a more objective and precise approach to
    estimating the regression line than the scatter
    graph method (the latter fits the regression line
    by visual inspection).
  • Unlike the high-low method, the least-squares
    regression method takes all of the data into
    account when estimating the cost formula.

5
Definition of Terms
  • A regression equation (a regression line when
    plotted on a graph) identifies an estimated
    relationship between a dependent (i.e. cost Y)
    and one or more independent variables (i.e. an
    activity measure or cost driver) based on past
    observations.
  • Simple regression when the regression equation
    includes a dependent variable and only one
    independent variable.
  • Multiple regression when the regression equation
    includes a dependent variable and two or more
    independent variables.

6
Definition of Terms (cont)
  • Types of Relationships between dependent and
    independent variables
  • Direct vs. Inverse
  • Direct: X and Y increase together
  • Inverse: X and Y move in opposite directions
  • Linear vs. Curvilinear
  • Linear: a straight line best describes the
    relationship between X and Y
  • Curvilinear: a curved line best describes the
    relationship between X and Y

7
Possible Relationships Between X and Y in Scatter
Diagrams
8
Simple Linear Regression
  • Simple -only one independent or predictor
    variable (X)
  • Linear -the mathematical relation between X and Y
    is in the form
  • Y = a + bX

9
Simple Linear Regression Equation
10
Linear Equations
11
Estimating the Linear Equation Using the
Least-Squares Method (LSM)
  • Looks at the differences between actual values
    (Y) and predicted values (Ŷ). A good fit makes
    these differences small.
  • But positive differences offset negative ones.
  • The LSM therefore minimizes the sum of the
    squared differences (or errors).

12
Least Squares Method Graphically
13
Coefficient Equations
14
Computation Table
15
Computation Table (cont)
  • Where
  • X = the level of activity (independent variable)
  • Y = the total mixed cost (dependent variable)
  • a = the total fixed cost (the vertical intercept
    of the line)
  • b = the variable cost per unit of activity (the
    slope of the line)
  • n = the number of observations
  • Σ = the sum across all n observations
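The definitions above feed the standard least-squares coefficient equations, b = (nΣXY - ΣXΣY) / (nΣX² - (ΣX)²) and a = (ΣY - bΣX) / n. A minimal sketch in Python (the activity and cost figures are invented for illustration, not taken from the textbook examples):

```python
# Least-squares coefficients from the normal equations.
# Invented data: X = activity level, Y = total mixed cost.
x = [100, 200, 300, 400, 500]
y = [1500, 1900, 2400, 2800, 3300]

n = len(x)
sum_x, sum_y = sum(x), sum(y)
sum_xy = sum(xi * yi for xi, yi in zip(x, y))
sum_x2 = sum(xi ** 2 for xi in x)

# b = variable cost per unit of activity (slope)
b = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
# a = total fixed cost (vertical intercept)
a = (sum_y - b * sum_x) / n

print(f"Y = {a:.2f} + {b:.2f}X")
```

With these figures the fit is Y = 1030 + 4.5X, i.e. fixed cost 1,030 and variable cost 4.50 per unit of activity.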

16
See Drury (2004)
  • Example on page 1044:
  • Exhibit 24.1 and Figure 24.3

17
The Example from Drury (2004)
18
Solution
19
Also, See Seal et al. (2006)
  • Example on page 183,
  • Solution on pages 193-194

20
The Example from Seal et al. (2006)
21
Solution
22
Test of Reliability
  • To see how reliable potential cost drivers (e.g.
    machine hours, direct labour hours, units of
    output, or number of production runs) are in
    predicting the dependent variable (the total
    mixed cost), three tests of reliability can be
    applied:
  • The coefficient of determination,
  • The standard error of the estimate, and
  • The standard error of the coefficient

23
The Coefficient of Determination (r2)
  • The R-square is a general measure of the
    usefulness of the regression model. It measures
    the extent, or strength, of the association
    between two variables (X, Y).
  • It indicates how much of the fluctuation in the
    dependent variable is explained by its
    relationship with the independent variable(s).
  • An R-square of 1.00 indicates that 100% of the
    variation in the dependent variable is explained
    by the independent variable(s).
  • Conversely, an R-square of 0.00 indicates that
    none of the variation in the dependent variable
    is explained by the independent variable(s).
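The coefficient of determination is simply the square of the Pearson correlation coefficient, r = (nΣXY - ΣXΣY) / sqrt((nΣX² - (ΣX)²)(nΣY² - (ΣY)²)). A sketch with invented figures:

```python
import math

# r-squared for a simple regression; data invented for illustration.
x = [2, 4, 6, 8, 10]
y = [30, 52, 68, 95, 110]

n = len(x)
sx, sy = sum(x), sum(y)
sxy = sum(xi * yi for xi, yi in zip(x, y))
sx2 = sum(xi * xi for xi in x)
sy2 = sum(yi * yi for yi in y)

# Pearson correlation r; its square is the coefficient of determination
r = (n * sxy - sx * sy) / math.sqrt((n * sx2 - sx ** 2) * (n * sy2 - sy ** 2))
r_squared = r ** 2
```

An r_squared close to 1 (here roughly 0.99) says almost all of the variation in cost is explained by the chosen driver.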

24
r2 - Perfect Correlation: An Example (r2 = 1)
25
r2 - No Correlation: An Example (r2 = 0)
26
r2 Computation
27
r2 Example - Drury (2004), p. 1044 Solution,
Appendix 24.1
28
Coefficient of Correlation (r)
29
Various r Values
30
r Example - Drury (2004), p. 1044 Solution,
Appendix 24.1
31
Standard Error of Estimate (se)
  • r2 gives us an indication of the reliability of
    the estimate of total cost but it does not give
    us an indication of the absolute size of the
    probable deviations from the regression line.
  • This is important because the least-squares line
    is calculated from sample data and other samples
    would probably result in different estimates.
  • se measures the reliability of the regression
    line.
  • It measures the variability, or scatter, of the
    observed values around the regression line.
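The standard error of the estimate is se = sqrt(Σ(Y - Ŷ)² / (n - 2)), where Ŷ is the value predicted by the fitted line and n - 2 reflects the two parameters (a and b) estimated. A sketch with invented data:

```python
import math

# Standard error of the estimate for a simple regression.
# Data invented for illustration.
x = [1, 2, 3, 4, 5]
y = [12, 19, 33, 38, 52]
n = len(x)

# Fit Y = a + bX by least squares first
b = (n * sum(xi * yi for xi, yi in zip(x, y)) - sum(x) * sum(y)) \
    / (n * sum(xi * xi for xi in x) - sum(x) ** 2)
a = (sum(y) - b * sum(x)) / n

# Sum of squared deviations of observed Y from predicted Y-hat = a + bX
residual_ss = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
se = math.sqrt(residual_ss / (n - 2))   # n - 2: two parameters estimated
```

A smaller se means the observations cluster tightly around the regression line, so predictions from the cost formula are more reliable.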

32
Scatter Around the Regression Line
33
Formula to Compute se
34
se Example - Drury (2004), p. 1044 Solution,
Appendix 24.1
35
The standard error of the coefficient (sb)
36
sb Example - Drury (2004), p. 1044 Solution,
Appendix 24.1
37
Computer Programs to perform a simple regression
analysis
  • SPSS and Performing A Simple Regression Analysis
  • Microsoft Excel and Performing A Simple
    Regression Analysis

38
Multiple Linear Regression
  • Simple least-squares regression analysis is
    based on the assumption that total cost is
    determined by one activity-based variable only
    (only one factor is taken into consideration).
  • However, other variables besides activity are
    likely to influence total cost.
  • E.g. shipping costs may depend on both the number
    of units shipped and the weight of the units.
  • In a situation such as this, multiple regression
    is necessary (where several factors are
    considered in combination).

39
Multiple Linear Regression (cont)
  • If two independent variables (e.g. machine hours
    and temperature) influence the total cost (e.g.
    the cost of steam generation) and the
    relationship is assumed to be linear, the
    regression equation will be
  • y = a + b1x1 + b2x2
  • where
  • a = the total fixed cost.
  • b1 = the regression coefficient for machine
    hours (i.e. the average change in y resulting
    from a unit change in x1, assuming that x2
    remains constant).
  • x1 = the number of machine hours.
  • b2 = the regression coefficient for temperature
    (i.e. the average change in y resulting from a
    unit change in x2, assuming that x1 remains
    constant).
  • x2 = the number of days per month in which the
    temperature is less than 15°C.

40
Multicollinearity problem
  • Multiple regression analysis is based on the
    assumption that the independent variables are not
    correlated with each other.
  • When the independent variables are highly
    correlated with each other, it is very difficult
    to separate the effects of each of these
    variables on the dependent variable.
  • This condition is called multicollinearity.
  • Generally, a coefficient of correlation between
    independent variables greater than 0.70 indicates
    multicollinearity.
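The 0.70 rule of thumb can be checked by correlating candidate drivers against each other before running the regression. A sketch with invented figures in which labour hours closely track machine hours:

```python
import math

# Screening two candidate cost drivers for multicollinearity.
# Data invented; labour hours roughly track machine hours.
machine_hours = [100, 120, 150, 180, 200]
labour_hours = [210, 238, 310, 355, 405]

def pearson_r(x, y):
    """Pearson correlation coefficient between two samples."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(a * b for a, b in zip(x, y))
    sx2 = sum(a * a for a in x)
    sy2 = sum(b * b for b in y)
    return (n * sxy - sx * sy) / math.sqrt((n * sx2 - sx ** 2) * (n * sy2 - sy ** 2))

r = pearson_r(machine_hours, labour_hours)
collinear = abs(r) > 0.70   # the slide's rule of thumb
```

Here the two drivers are almost perfectly correlated, so including both in one multiple regression would make their separate coefficients unreliable.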

41
Non-linear regression (the learning-curve effect)
  • Changes in the efficiency of the labour force may
    render past information unsuitable for predicting
    future labour costs.
  • A situation like this may occur when workers
    become more familiar with the tasks that they
    perform, so that less labour time is required for
    the production of each unit.
  • This phenomenon is known as the learning-curve
    effect.

42
Non-linear regression (the learning-curve effect)
cont
  • The learning curve can be expressed in equation
    form as follows
  • Yx = ax^b
  • Where
  • Yx = the cumulative average time required to
    produce x units.
  • a = the time required to produce the first unit
    of output.
  • x = the number of units of output under
    consideration.
  • The exponent b is the logarithm of the learning
    curve improvement rate (e.g. 80%, i.e. 0.8)
    divided by the logarithm of 2.

43
Example: An application of the 80% learning curve
  • Labour hours are required on a sequence of six
    orders, where the cumulative number of units is
    doubled for each order.
  • The first unit of output was completed on the
    first order in 2,000 hours.
  • Required:
  • Calculate the cumulative average time (per unit)
    taken to produce 2, 4, 8, 16 and 32 units
    respectively, assuming that the average time per
    unit is 80% of the average time per unit of the
    previous cumulative production.

44
Solution
45
Solution (cont)
  • The cumulative average time (per unit) taken to
    produce 2, 4, 8, 16 and 32 units:
  • First determine b = log(0.8) / log(2) = -0.322,
    so Yx = 2000 * x^-0.322
  • Order 1: Y1 = 2000 hours
  • Order 2: Y2 = 2000 * 2^-0.322 = 1600 hours
  • Order 3: Y4 = 2000 * 4^-0.322 = 1280 hours
  • Order 4: Y8 = 2000 * 8^-0.322 = 1024 hours
  • Order 5: Y16 = 2000 * 16^-0.322 = 819 hours
  • Order 6: Y32 = 2000 * 32^-0.322 = 655 hours
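The figures above can be reproduced directly from Yx = ax^b (a Python sketch; the exact values are 819.2 and 655.36 hours before rounding):

```python
import math

# 80% learning curve: Yx = a * x**b,
# with a = 2000 hours and b = log(0.8) / log(2).
a = 2000.0
b = math.log(0.8) / math.log(2)   # approximately -0.322

# Cumulative average time per unit at each doubling of output
times = {x: a * x ** b for x in (1, 2, 4, 8, 16, 32)}
# Each doubling multiplies the cumulative average time by 0.8:
# 2000, 1600, 1280, 1024, 819.2 and 655.36 hours
```

Because the cumulative output doubles at each order, every step is simply 80% of the previous cumulative average time.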

46
Solution (cont)
47
Solution (cont)Graphical method
48
Factors to be considered when using past data to
estimate cost functions
  • The cost data and activity should relate to the
    same period (e.g. some costs, such as wages
    paid, lag behind the associated activity).
  • Number of observations (a sufficient number of
    observations must be obtained).
  • Accounting policies (e.g. allocated costs should
    not lead to distorted cost functions).
  • Adjustments for past changes (allow for any
    changes of circumstances in the future).
  • Relevant range (i.e. the range of activity
    within which a particular straight line provides
    a reasonable approximation to the real
    underlying cost function) and non-linear cost
    functions (see next two slides).

49
The Linearity Assumption and the Relevant Range
50
Fixed Costs and Relevant Range
51
Summary
  • The least-squares regression method is an
    objective and precise approach to estimating a
    cost function based on the analysis of past data.
    The stages involved in the estimation are:
  • 1) Select the dependent variable (y) to be
    predicted,
  • 2) Select the potential cost drivers (Xs),
  • 3) Collect data on the dependent variable and
    cost drivers,
  • 4) Plot the observations on a graph,
  • 5) Estimate the cost function, and
  • 6) Test the reliability of the cost function.

52
Workshop (2)
  • See Exercises P5-15 and P5-16 (Seal et al., 2006)