OLS Estimate of Work Conflict and Family Conflict
1. Consider the following simple regression equation
y = β0 + βx + ε,
where x, y and ε are n × 1 vectors, and β0 and β are scalars.
1.1 Show that
β̂ = sxy / sxx,   (1)
where sxy is a sample covariance between x and y and sxx is a sample variance of x. Hint: formulate the sum of squared errors (SSE) optimization problem and find
respective optimizers, β̂0 and β̂. Explain each step clearly.
1.2 Let rxy be the correlation between x and y. Express the relationship between rxy and β̂. Explain in which case they are equal.
1.3 Write down an expression for cos θ, where θ is the angle between x and y, using the law of cosines. Explain each step clearly. Compare your result with rxy. Interpret your findings.
2. One hundred and ninety (190) working adults completed a survey in which the following were measured: level of conflict with fellow workers (work conflict; WC),
level of conflict with family members (family conflict; FC), level of conflict between role as a worker and role as a family member (inter-role conflict; IC), work satisfaction
(WS), family satisfaction (FS) and life satisfaction (LS). The attached SAS code will allow you to create a SAS dataset (called CONSTAT) that contains the correlations between these six measures.
2.1 Using SAS/IML language compute the covariance matrix S by definition as in (3.38) of the textbook.
2.2 Using SAS/IML language compute the inverse covariance matrix. Let P (n × n) = S⁻¹; then we call P a precision matrix. Is the precision matrix defined? What can you say about the rank of S?
2.3 What is the relationship between work conflict and family conflict given inter-role conflict is fixed? What is the relationship between work satisfaction and family satisfaction given life satisfaction is held constant? Hint: run regression models and compute first-order partial correlations. Interpret their significance levels.
2.4 Let S1 be a covariance matrix of WC, FC and IC. Similarly, let S2 be a covariance matrix of WS, FS and LS. Determine the rank of S1 and S2 by definition. Do S1
and S2 have unique inverses?
2.5 Compute their respective precision matrices using Gauss-Jordan Elimination. For S1 and S2 compute the following:
α = − p12 / (√p11 · √p22).   (2)
Show your work.
2.6 Compare α with your results in 2.3. Interpret your findings.
2.7 Which of the two relationships changed more after accounting for a third variable? Why? What are the practical implications of these results?
3. Let z1 = WS + FS + LS, z2 = WS + FS, z3 = WS.
3.1 Find z̄ and Sz using (3.62) and (3.64).
3.2 Compare the variances of z1, z2 and z3 with the respective variances of the relevant original variables (ex., z3 with WS). Do the results meet your expectations? Explain your conclusions.
3.3 Explain when the result Var[X + Y] = Var[X] + Var[Y]   (3) holds. Make the required assumptions on X and Y and prove (3). Hint: use the definition of the variance.
1.1 Consider the model y = β0 + βx + ε, where x, y and ε are n × 1 vectors and β0 and β are scalars.
To prove (1), note that the OLS estimates fit the data so that the residual sum of squares (RSS) is minimized:
RSS(β0, β) = Σi (yi − β0 − βxi)².
To minimize RSS with respect to β0 and β, take the first-order derivatives and set them to zero:
∂RSS/∂β0 = −2 Σi (yi − β0 − βxi) = 0, and
∂RSS/∂β = −2 Σi xi (yi − β0 − βxi) = 0.
Rearranging these two conditions gives:
Σi yi = n β̂0 + β̂ Σi xi ……………(i)
Σi xi yi = β̂0 Σi xi + β̂ Σi xi² ……………(ii)
Here, equations (i) and (ii) are the normal equations. Solving these two equations gives β̂0 and β̂.
Equation (i) can be solved for the intercept:
β̂0 = ȳ − β̂ x̄.
This calculated value of β̂0 is then substituted into equation (ii):
Σi xi yi = (ȳ − β̂ x̄) Σi xi + β̂ Σi xi²,
which rearranges to
β̂ (Σi xi² − n x̄²) = Σi xi yi − n x̄ ȳ,
and therefore
β̂ = Σi (xi − x̄)(yi − ȳ) / Σi (xi − x̄)² = sxy / sxx,
since dividing the numerator and denominator by (n − 1) turns them into the sample covariance sxy and the sample variance sxx.
Thus the value of β̂ in (1) is obtained as desired.
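As a quick numerical check of formula (1), the following SAS/IML sketch (using hypothetical toy data; any x and y of equal length would do) computes sxy/sxx and the corresponding intercept ȳ − β̂x̄ directly from the definitions:

proc iml;
/* Hypothetical toy data, used only to illustrate formula (1). */
x = {1, 2, 3, 4, 5};
y = {2.1, 3.9, 6.2, 7.8, 10.1};
n = nrow(x);
sxx = ssq(x - x[:]) / (n - 1);                 /* sample variance of x         */
sxy = sum((x - x[:]) # (y - y[:])) / (n - 1);  /* sample covariance of x and y */
bhat  = sxy / sxx;                             /* slope estimate, as in (1)    */
b0hat = y[:] - bhat * x[:];                    /* intercept = ybar - bhat*xbar */
print bhat b0hat;
quit;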
1.2 The sample correlation between x and y is rxy = sxy / (sx sy), where sy is the standard deviation of y and sx is the standard deviation of x. Since β̂ = sxy / sxx = sxy / sx², it follows that β̂ = rxy · (sy / sx). From this relationship it can be seen that β̂ equals rxy exactly when sx = sy, i.e. when xi and yi have the same sample standard deviation (for example, when both variables are standardized).
1.3 Write down an expression for cos θ, where θ is the angle between x and y, using the law of cosines. Explain each step clearly. Compare your result with rxy. Interpret your findings.
The law of cosines for two n-dimensional vectors a and b states that
‖a − b‖² = ‖a‖² + ‖b‖² − 2 ‖a‖ ‖b‖ cos θ,
where θ is the angle between them. Expanding the left-hand side as ‖a‖² + ‖b‖² − 2 a′b and cancelling the common terms gives
cos θ = a′b / (‖a‖ ‖b‖).
Here ‖a‖ = √(Σ ai²) measures the distance of the vector a from the origin. In a similar way, the standard deviation measures the magnitude by which a random variable deviates from its mean: for the centered vectors xc = x − x̄1 and yc = y − ȳ1, the norms ‖xc‖ and ‖yc‖ equal the sample standard deviations of x and y up to the common factor √(n − 1).
Applying the cosine expression to the centered vectors therefore gives
cos θ = Σi (xi − x̄)(yi − ȳ) / [√(Σi (xi − x̄)²) · √(Σi (yi − ȳ)²)].
This expression is identical to rxy from part 1.2: the numerator corresponds to the sample covariance sxy, and the two norms in the denominator correspond to the sample standard deviations sx and sy (the factors of n − 1 cancel). In other words, the correlation coefficient is the cosine of the angle between the mean-centered x and y vectors: rxy near ±1 means the centered vectors are nearly collinear, while rxy = 0 means they are orthogonal.
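In LaTeX notation, the law-of-cosines step for the centered vectors xc = x − x̄1 and yc = y − ȳ1 can be summarized as

\[
\|x_c - y_c\|^2 = \|x_c\|^2 + \|y_c\|^2 - 2\,\|x_c\|\,\|y_c\|\cos\theta
\quad\Longrightarrow\quad
\cos\theta = \frac{x_c^{\top}y_c}{\|x_c\|\,\|y_c\|}
= \frac{\sum_i (x_i-\bar{x})(y_i-\bar{y})}
       {\sqrt{\sum_i (x_i-\bar{x})^2}\,\sqrt{\sum_i (y_i-\bar{y})^2}}
= r_{xy}.
\]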
2.1 Using SAS/IML language compute the covariance matrix S by definition as in (3.38) of the textbook.
The covariance matrix computed using SAS/IML is shown in the table below.
Cov matrix S
         WC          FC          IC          WS          FS          LS
WC    0.2859367   0.2015267   0.1586767  -0.263473   -0.189393   -0.262987
FC    0.2015267   0.3448667   0.1879667  -0.242573   -0.313293   -0.293947
IC    0.1586767   0.1879667   0.2437767  -0.174813   -0.201833   -0.236767
WS   -0.263473   -0.242573   -0.174813    0.2999867   0.2027067   0.2724933
FS   -0.189393   -0.313293   -0.201833    0.2027067   0.3197467   0.2641333
LS   -0.262987   -0.293947   -0.236767    0.2724933   0.2641333   0.3472667
2.2 The inverse covariance matrix (precision matrix) computed using SAS/IML is shown below.
Inverse Cov matrix P = S⁻¹
         WC          FC          IC          WS          FS          LS
WC    1.3643869  -0.242122   -0.279302    0.4512048  -0.193688    0.1412677
FC   -0.242122    1.611781   -0.183561    0.049706    0.7425796   0.1332994
IC   -0.279302   -0.183561    1.2238563  -0.188901    0.0751249   0.2860141
WS    0.4512048   0.049706   -0.188901    1.5702381  -0.150868   -0.645472
FS   -0.193688    0.7425796   0.0751249  -0.150868    1.6462092  -0.424015
LS    0.1412677   0.1332994   0.2860141  -0.645472   -0.424015    1.7314349
Since the covariance matrix S is invertible, the precision matrix P = S⁻¹ is defined, and S has full rank (6 for this 6 × 6 matrix). The table below shows the SVD tolerance and rank reported by the SAS/IML computation.
tol          rankSVD
4.627E-16    18
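A minimal SAS/IML sketch of the computations in 2.1 and 2.2 is given below. It assumes the CONSTAT data set contains six numeric columns named WC, FC, IC, WS, FS and LS (the column names are an assumption based on the output above). S is built from the definition (3.38), P = S⁻¹, and the rank is obtained from the singular values, mirroring the tol/rankSVD output shown above:

proc iml;
use CONSTAT;
read all var {WC FC IC WS FS LS} into X;    /* rows are treated as observations */
close CONSTAT;

n    = nrow(X);
xbar = X[:, ];                              /* 1 x 6 row of column means        */
Xc   = X - j(n, 1, 1) * xbar;               /* center each column               */
S    = Xc` * Xc / (n - 1);                  /* covariance by definition (3.38)  */
P    = inv(S);                              /* precision matrix P = S^{-1}      */

call svd(u, q, v, S);                       /* singular values of S             */
tol     = max(nrow(S), ncol(S)) * q[1] * constant("maceps");
rankSVD = sum(q > tol);                     /* count singular values above tol  */
print S, P, tol rankSVD;
quit;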
2.3 To examine the relationship between work conflict and family conflict with inter-role conflict held fixed, a regression model was run. Here work conflict is the dependent variable and family conflict is the independent variable whose influence is analyzed, with inter-role conflict held constant.
Number of Observations Read    6
Number of Observations Used    6
The model is based on 6 observations. The table below shows that the independent variable family conflict explains about 41% of the variation in the dependent variable work conflict, as indicated by the R² value of 0.4119.
Root MSE            0.45849    R-Square    0.4119
Dependent Mean      0.11167    Adj R-Sq    0.2648
Coeff Var         410.59033
In addition, the parameter estimates table below shows that family conflict does not have a significant effect on work conflict when inter-role conflict is held constant: the p-value of 0.1695 is greater than α = 0.10.
Parameter Estimates
Variable     DF    Parameter Estimate    Standard Error    t Value    Pr > |t|
Intercept     1         0.07855               0.18822         0.42      0.6979
FC            1         0.58436               0.34916         1.67      0.1695
IC            1         0                     0                .        .
RESTRICT     -1         0.24418               0.38542         0.63      0.6035*
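The RESTRICT row with DF = −1 in the table above is characteristic of a RESTRICT statement in PROC REG. The exact code is not shown in the assignment, so the following is only a hedged reconstruction of a model of this general form (the data set name and the particular restriction are assumptions):

/* Hypothetical reconstruction of a restricted regression of WC on FC and IC. */
proc reg data=constat;
   model WC = FC IC;       /* work conflict regressed on FC and IC                 */
   restrict IC = 0;        /* fixes the IC coefficient, producing the RESTRICT row */
run;
quit;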
Next, a second regression model was run to determine the relationship between work satisfaction and family satisfaction, with life satisfaction held constant. Here work satisfaction is the dependent variable and family satisfaction is the independent variable. The R² value of 0.4284 shows that family satisfaction explains about 42% of the variation in work satisfaction.
Root MSE            0.46298    R-Square    0.4284
Dependent Mean      0.16667    Adj R-Sq    0.2855
Coeff Var         277.78664
In addition, the parameter estimates table below shows that family satisfaction has no significant effect on work satisfaction, since the p-value of 0.1584 is greater than the standard acceptance level α = 0.10.
Parameter Estimates
Variable     DF    Parameter Estimate    Standard Error    t Value    Pr > |t|
Intercept     1         0.08003               0.19552         0.41      0.7033
FS            1         0.63396               0.36616         1.73      0.1584
LS            1         1.11022E-16           0               Infty     <.0001
RESTRICT     -1         0.52522               0.37193         1.41      0.1826*
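As an alternative to the regressions above, the first-order partial correlations mentioned in the hint can be computed directly. The sketch below assumes the 6 × 6 covariance matrix S from 2.1 is available in the same SAS/IML session, with variables ordered WC, FC, IC, WS, FS, LS; it converts S to a correlation matrix and applies the standard first-order formula r12.3 = (r12 − r13·r23) / sqrt((1 − r13²)(1 − r23²)):

proc iml;
/* Assumes S (6 x 6) from part 2.1 is available; variable order WC FC IC WS FS LS. */
d = sqrt(vecdiag(S));
R = S / (d * d`);                         /* convert covariances to correlations  */

start partial(R, i, j, k);                /* first-order partial r_ij.k           */
   return( (R[i,j] - R[i,k]*R[j,k]) / sqrt((1 - R[i,k]##2) * (1 - R[j,k]##2)) );
finish;

r_wcfc_ic = partial(R, 1, 2, 3);          /* WC and FC, controlling for IC        */
r_wsfs_ls = partial(R, 4, 5, 6);          /* WS and FS, controlling for LS        */
print r_wcfc_ic r_wsfs_ls;
quit;

The two printed values are the same quantities that formula (2) produces from the 3 × 3 precision matrices in part 2.5.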
2.4 The covariance matrix S1 (of WC, FC and IC) is given below.
S1
         WC          FC          IC
WC    0.2859367   0.2015267   0.1586767
FC    0.2015267   0.3448667   0.1879667
IC    0.1586767   0.1879667   0.2437767
The next table shows the SVD tolerance and rank reported for S1; as a 3 × 3 nonsingular matrix, S1 has full rank.
tol          rankSVD
2.297E-16    9
The covariance matrix S2 (of WS, FS and LS) is given below.
S2
         WS          FS          LS
WS    0.2999867   0.2027067   0.2724933
FS    0.2027067   0.3197467   0.2641333
LS    0.2724933   0.2641333   0.3472667
The next table shows the SVD tolerance and rank reported for S2; like S1, it is a 3 × 3 nonsingular matrix and therefore has full rank.
tol          rankSVD
2.313E-16    9
The precision (inverse covariance) matrices of S1 and S2 are compared below:
Inverse Covariance of S1
         WC          FC          IC
WC    6.6095291  -2.617501   -2.283956
FC   -2.617501    6.0382586  -2.952108
IC   -2.283956   -2.952108    7.86502
Inverse Covariance of S2
         WS          FS          LS
WS   11.628985    0.4455668  -9.463937
FS    0.4455668   8.4313862  -6.762596
LS   -9.463937   -6.762596   15.449472
Comparing the two results shows that S1 and S2 are both nonsingular, so each has a unique inverse; their precision matrices, shown above, clearly differ from one another.
2.5 Compute their respective precision matrices using Gauss-Jordan Elimination. For S1 and S2 compute the following: α = − p12 / (√p11 · √p22).   (2) Show your work.
Formula (2) uses the entries p11, p12 and p22 of the precision matrices, not of the covariance matrices. For S1, the precision matrix P1 = S1⁻¹ shown above gives α = −(−2.617501) / (√6.6095291 · √6.0382586) ≈ 0.41, which can be checked in MS-Excel as =-(-2.617501)/(SQRT(6.6095291)*SQRT(6.0382586)). For S2, the entries of P2 = S2⁻¹ give α = −0.4455668 / (√11.628985 · √8.4313862) ≈ −0.045. By contrast, applying the same expression directly to the covariance entries, (0.2015267)/(SQRT(0.2859367)*SQRT(0.3448667)) = 0.641 for S1 and (0.2027067)/(SQRT(0.2999867)*SQRT(0.3197467)) ≈ 0.655 for S2, reproduces the simple (zero-order) correlations between WC and FC and between WS and FS rather than α.
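To show the work for the Gauss-Jordan step, a minimal SAS/IML sketch is given below. It is not the code used in the assignment; it implements standard Gauss-Jordan elimination (augment with the identity, reduce the left block to the identity, read the inverse off the right block) and then evaluates formula (2). The module assumes nonzero pivots, which holds for these positive definite matrices:

proc iml;
start gaussjordan(A);
   k = nrow(A);
   M = A || I(k);                                  /* augment with the identity    */
   do i = 1 to k;
      M[i, ] = M[i, ] / M[i, i];                   /* scale pivot row to 1         */
      do r = 1 to k;
         if r ^= i then
            M[r, ] = M[r, ] - M[r, i] * M[i, ];    /* eliminate column i elsewhere */
      end;
   end;
   return( M[ , (k+1):(2*k)] );                    /* right block is the inverse   */
finish;

S1 = {0.2859367 0.2015267 0.1586767,
      0.2015267 0.3448667 0.1879667,
      0.1586767 0.1879667 0.2437767};
P1 = gaussjordan(S1);                              /* precision matrix of S1       */
alpha1 = -P1[1,2] / sqrt(P1[1,1] * P1[2,2]);       /* formula (2)                  */
print P1, alpha1;
quit;

Calling gaussjordan on S2 in the same way yields the precision matrix of S2 and its value of α.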
2.6 The value α ≈ 0.41 obtained from S1 is the first-order partial correlation between work conflict and family conflict given inter-role conflict. It is consistent with the first regression model in 2.3, where the FC coefficient of 0.58436 was positive but not significant at α = 0.10 (p = 0.1695). For S2, α ≈ −0.045 is the partial correlation between work satisfaction and family satisfaction given life satisfaction; it is close to zero, which agrees with the second regression in 2.3, where family satisfaction showed no significant effect on work satisfaction (p = 0.1584) once life satisfaction was held constant.
2.7 For the first model, adding inter-role conflict as an unrestricted third variable decreased the adjusted R² from 0.2648 to 0.1181, while for the second model adding life satisfaction increased the adjusted R² from 0.2855 to 0.5222. In neither model, however, did family conflict show a significant effect on work conflict, or family satisfaction on work satisfaction. The WS-FS relationship changed far more after accounting for the third variable: its correlation of about 0.65 dropped to a partial correlation of about −0.045 once life satisfaction was controlled, whereas the WC-FC correlation fell only from about 0.64 to about 0.41. Practically, this suggests that much of the apparent association between work and family satisfaction operates through overall life satisfaction, while the association between work conflict and family conflict persists, to a reduced degree, even after inter-role conflict is taken into account.
Fit statistics with the third variable added, dependent variable WC:
Root MSE            0.50216    R-Square    0.4709
Dependent Mean      0.11167    Adj R-Sq    0.1181
Coeff Var         449.69241
Fit statistics with the third variable added, dependent variable WS:
Root MSE            0.37858    R-Square    0.7133
Dependent Mean      0.16667    Adj R-Sq    0.5222
Coeff Var         227.14581
3. Let z1 = WS + FS + LS, z2 = WS + FS, z3 = WS.
3.1 Find z̄ and Sz using (3.62) and (3.64).
Variable    Mean         Std Dev
Z1          0.4566667    1.5638627
Here the mean and standard deviation of z1 are reported, along with the means of z1, z2 and z3. The standard deviation results indicate that the z values differ considerably from the group mean of 0.927, since the corresponding standard deviation of 3.074 is much larger than that mean.
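A minimal SAS/IML sketch of this step, assuming xbar (1 × 6) and S (6 × 6) from part 2.1 are available in the same IML session: by (3.62) and (3.64), z̄ = A x̄ and Sz = A S A′ for the coefficient matrix A that forms z1, z2 and z3 from (WC, FC, IC, WS, FS, LS):

proc iml;
/* Assumes xbar and S were computed as in the earlier sketch (same session). */
/* Rows of A define z1 = WS+FS+LS, z2 = WS+FS, z3 = WS.                      */
A = {0 0 0 1 1 1,
     0 0 0 1 1 0,
     0 0 0 1 0 0};
zbar = A * xbar`;        /* (3.62): means of z1, z2, z3             */
Sz   = A * S * A`;       /* (3.64): covariance matrix of z1, z2, z3 */
print zbar, Sz;
quit;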
3.2 Compare the variances of z1, z2 and z3 with the respective variances of the relevant original variables (ex., z3 with WS). Do the results meet your expectations? Explain your conclusions.
Variable    Variance
Z1          2.4456667
Comparing the variance of z1 (2.4457) with the variances of the original variables shows that z1 is spread out much more widely than, for example, family satisfaction (about 0.32). Similarly, the variance of z2 = WS + FS (Var(WS) + Var(FS) + 2Cov(WS, FS) ≈ 1.025) is much higher than the variance of life satisfaction (0.3473). There is, however, no difference between the variance of z3 and that of work satisfaction, both being 0.2999867, since z3 = WS. These results meet expectations: WS, FS and LS are positively correlated, so the variance of their sum exceeds the individual variances, while z3 is simply WS and its variance is unchanged.
3.3 Var[X + Y] = Var[X] + Var[Y]   (3)
Answer: Equation (3) holds when X and Y are uncorrelated, that is, when Cov[X, Y] = 0; independence of X and Y is a sufficient condition for this. The proof follows directly from the definition of the variance, as shown below.
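In LaTeX notation, writing μX = E[X] and μY = E[Y] and using the definition of variance:

\[
\begin{aligned}
\operatorname{Var}[X+Y]
  &= E\big[(X + Y - \mu_X - \mu_Y)^2\big] \\
  &= E\big[(X-\mu_X)^2\big] + E\big[(Y-\mu_Y)^2\big] + 2\,E\big[(X-\mu_X)(Y-\mu_Y)\big] \\
  &= \operatorname{Var}[X] + \operatorname{Var}[Y] + 2\operatorname{Cov}[X,Y],
\end{aligned}
\]

so (3) holds exactly when Cov[X, Y] = 0, i.e. when X and Y are uncorrelated (and in particular when they are independent).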