This is a package dedicated to performing a least squares constrained optimization on a linear objective function. The functions minimize the same objective function as lm, applying a constraint on the beta parameters:
$$S(\beta) = \sum_{i=1}^m \vert y_i - \sum_{j=1}^nX_{ij}\beta_j \vert^2 = \Vert y - X\beta\Vert^2$$
And
$$\hat{\beta} = \underset{\beta}{\arg\min}\, S(\beta)$$
under the constraints:
$$\text{lower} \leq \hat{\beta} \leq \text{upper}$$
The idea behind the package is to give users a way to perform a constrained "linear regression" in an easy and intuitive way. The functions require a formula in the same syntax and format as lm, a style most R users are familiar with.
So far the package includes two functions to perform the constrained optimization:

- colf_nls - uses the port algorithm, which comes from the stats::nls function.
- colf_nlxb - uses Nash's variant of the Marquardt nonlinear least squares solution, which comes from the nlsr::nlxb function.

You can find more details about the two algorithms by having a look at ?nls and ?nlxb respectively.
Now we will see how we can easily use the port algorithm to perform a constrained optimization. As you will see, we use colf_nls in the same way we would use lm, with the addition of upper and lower bounds for our parameter estimates. We will use the mtcars data set for the demonstration. Let's load the package and use mtcars to run a constrained least squares optimization model.
In the model below we use 4 variables to model mpg, which means we will have 5 parameter estimates (don't forget the intercept). Parameters are prefixed with param_ in the model's output. We set the lower bounds of those 5 parameter estimates to -2 and the upper bounds to 2 (obviously they do not need to be the same). Ideally, starting values should be provided. If omitted, a cheap guess will be made, which basically sets all starting values to 1. If the starting values do not fall within the boundaries defined by lower and upper, an error will be returned and you will need to change the starting values manually via the start argument.
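First we load the package (a reconstructed call; colf requires nlsr, which explains the message below):

library(colf)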
## Loading required package: nlsr
mymod <- colf_nls(mpg ~ cyl + disp + hp + qsec, mtcars, lower = rep(-2, 5), upper = rep(2, 5))
mymod
## Nonlinear regression model
## model: mpg ~ param_X.Intercept. * X.Intercept. + param_cyl * cyl + param_disp * disp + param_hp * hp + param_qsec * qsec
## data: model_ingredients$model_data
## param_X.Intercept. param_cyl param_disp param_hp
## 2.00000 0.23936 -0.03868 0.01033
## param_qsec
## 1.33915
## residual sum-of-squares: 417.6
##
## Algorithm "port", convergence message: relative convergence (4)
As you can see, all 5 parameter estimates fall within the defined boundaries. The formula provided above includes the intercept. In the output, X.Intercept. is a variable set to 1 and param_X.Intercept. is the estimated intercept.
If the starting values do not fall within the boundaries, an error will be returned. As mentioned previously, if they are not provided they will be set to 1.
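An error of this kind would come from a call like the following, where the default starting values of 1 violate the upper bound of 0.5 (a reconstructed call, matching the corrected version further below):

#default start values (all 1) violate upper = 0.5
colf_nls(mpg ~ cyl + disp + hp + qsec, mtcars, lower = rep(-2, 5), upper = rep(0.5, 5))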
## Error in nls(model_ingredients$model_formula, data = model_ingredients$model_data, : Convergence failure: initial par violates constraints
So they need to be set by the user:
colf_nls(mpg ~ cyl + disp + hp + qsec, mtcars, lower = rep(-2, 5), upper = rep(0.5, 5),
start = rep(0, 5))
## Nonlinear regression model
## model: mpg ~ param_X.Intercept. * X.Intercept. + param_cyl * cyl + param_disp * disp + param_hp * hp + param_qsec * qsec
## data: model_ingredients$model_data
## param_X.Intercept. param_cyl param_disp param_hp
## 0.50000 0.50000 -0.02539 0.06971
## param_qsec
## 0.50000
## residual sum-of-squares: 2238
##
## Algorithm "port", convergence message: relative convergence (4)
As with lm, colf_nls accepts the same kind of formula syntax:
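For instance, removing the intercept (this and the following calls are reconstructed from the models printed in each output):

#no intercept
colf_nls(mpg ~ hp + cyl - 1, mtcars)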
## Nonlinear regression model
## model: mpg ~ param_hp * hp + param_cyl * cyl
## data: model_ingredients$model_data
## param_hp param_cyl
## -0.1075 5.4036
## residual sum-of-squares: 3150
##
## Algorithm "port", convergence message: both X-convergence and relative convergence (5)
## Nonlinear regression model
## model: mpg ~ param_X.Intercept. * X.Intercept. + param_cyl * cyl + param_disp * disp + param_hp * hp + param_drat * drat + param_wt * wt + param_qsec * qsec + param_vs * vs + param_am * am + param_gear * gear + param_carb * carb
## data: model_ingredients$model_data
## param_X.Intercept. param_cyl param_disp param_hp
## 12.30338 -0.11144 0.01334 -0.02148
## param_drat param_wt param_qsec param_vs
## 0.78711 -3.71530 0.82104 0.31776
## param_am param_gear param_carb
## 2.52023 0.65541 -0.19942
## residual sum-of-squares: 147.5
##
## Algorithm "port", convergence message: relative convergence (4)
## Nonlinear regression model
## model: mpg ~ param_X.Intercept. * X.Intercept. + param_I.hp...cyl. * I.hp...cyl.
## data: model_ingredients$model_data
## param_X.Intercept. param_I.hp...cyl.
## 30.36670 -0.06722
## residual sum-of-squares: 438.6
##
## Algorithm "port", convergence message: both X-convergence and relative convergence (5)
## Nonlinear regression model
## model: mpg ~ param_X.Intercept. * X.Intercept. + param_hp * hp + param_cyl * cyl + param_disp * disp + param_hp.cyl * hp.cyl + param_hp.disp * hp.disp + param_cyl.disp * cyl.disp + param_hp.cyl.disp * hp.cyl.disp
## data: model_ingredients$model_data
## param_X.Intercept. param_hp param_cyl param_disp
## 9.290e+01 -4.703e-01 -1.059e+01 -3.865e-01
## param_hp.cyl param_hp.disp param_cyl.disp param_hp.cyl.disp
## 6.734e-02 2.808e-03 5.270e-02 -3.841e-04
## residual sum-of-squares: 153.9
##
## Algorithm "port", convergence message: relative convergence (4)
## Nonlinear regression model
## model: mpg ~ param_X.Intercept. * X.Intercept. + param_hp.cyl * hp.cyl
## data: model_ingredients$model_data
## param_X.Intercept. param_hp.cyl
## 27.26649 -0.00713
## residual sum-of-squares: 407.2
##
## Algorithm "port", convergence message: both X-convergence and relative convergence (5)
## Nonlinear regression model
## model: mpg ~ param_X.Intercept. * X.Intercept. + param_hp * hp + param_cyl * cyl + param_hp.cyl * hp.cyl
## data: model_ingredients$model_data
## param_X.Intercept. param_hp param_cyl param_hp.cyl
## 50.75121 -0.17068 -4.11914 0.01974
## residual sum-of-squares: 247.6
##
## Algorithm "port", convergence message: relative convergence (4)
Notice that when the above versions are used, the parameter names are created with make.names in order to be syntactically valid (otherwise the optimizers fail). This is why you see an 'X.' in front of the intercept and extra dots in some of the names.
colf provides a number of methods for colf objects:

- predict - uses the parameter estimates to predict on a new data set
- coef - retrieves the coefficients
- resid - retrieves the residuals
- print - prints the model
- summary - views a summary of the model
- fitted - retrieves the fitted values

In order to use the parameter estimates to make predictions on a new data set, you need to remember two really important checks:

- the new data set's variable names need to be the same as in the original data set
- the new data set's column classes need to be the same as in the original data set
If either of the two is violated, predict will fail.
set.seed(10)
newdata <- data.frame(hp = mtcars$hp, cyl = mtcars$cyl, disp = mtcars$disp, qsec = mtcars$qsec)
predict(mymod, newdata)
## 1 2 3 4 5 6 7 8
## 20.42650 21.17642 24.66251 20.62689 14.59128 22.89609 13.73410 24.70696
## 9 10 11 12 13 14 15 16
## 29.15953 22.73086 23.53435 18.40833 18.67616 19.21182 11.85499 12.20813
## 17 18 19 20 21 22 23 24
## 12.60092 26.66851 25.36775 27.52795 26.11065 15.75659 16.87389 13.54502
## 25 26 27 28 29 30 31 32
## 13.08441 25.89359 21.60836 23.07806 12.48397 20.39242 15.28503 24.31159
But if I change any of the names or classes, predict will fail:
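For example, renaming one of the columns (the new name here is arbitrary; reconstructed to match the error below):

#change a column name
newdata1 <- newdata
names(newdata1)[1] <- "horsepower"
predict(mymod, newdata1)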
## Error in eval(predvars, data, env): object 'hp' not found
#change column class
newdata2 <- newdata
newdata2$cyl <- as.character(newdata2$cyl)
predict(mymod, newdata2)
## Error in value[[3L]](cond): newdata column classes need to be the same as original data
The rest of the colf_nls methods are demonstrated below:
You need to be careful when using summary because it returns p-values. By default nls and nlxb both return p-values for the coefficients, and these are naturally passed on to colf. In an unconstrained regression the p-values test the hypothesis that a coefficient is zero. In a constrained regression this test may not even make sense: a common restriction, for example, is to force the coefficients to be positive, in which case the hypothesis test does not hold at all since we have already restricted the coefficients away from zero. Other assumptions that we make in unconstrained regression (like the coefficients' distribution) do not hold either, so the use and interpretation of the p-values can be problematic when we set lower and/or upper.
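The summary below would come from (reconstructed call):

summary(mymod)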
##
## Formula: mpg ~ param_X.Intercept. * X.Intercept. + param_cyl * cyl + param_disp *
## disp + param_hp * hp + param_qsec * qsec
##
## Parameters:
## Estimate Std. Error t value Pr(>|t|)
## param_X.Intercept. 2.00000 14.03972 0.142 0.8878
## param_cyl 0.23936 1.08431 0.221 0.8269
## param_disp -0.03868 0.01480 -2.614 0.0145 *
## param_hp 0.01033 0.02281 0.453 0.6543
## param_qsec 1.33915 0.61867 2.165 0.0394 *
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 3.933 on 27 degrees of freedom
##
## Algorithm "port", convergence message: relative convergence (4)
## param_X.Intercept. param_cyl param_disp param_hp
## 2.00000000 0.23936254 -0.03867616 0.01032871
## param_qsec
## 1.33914631
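Printing the model (reconstructed call):

print(mymod)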
## Nonlinear regression model
## model: mpg ~ param_X.Intercept. * X.Intercept. + param_cyl * cyl + param_disp * disp + param_hp * hp + param_qsec * qsec
## data: model_ingredients$model_data
## param_X.Intercept. param_cyl param_disp param_hp
## 2.00000 0.23936 -0.03868 0.01033
## param_qsec
## 1.33915
## residual sum-of-squares: 417.6
##
## Algorithm "port", convergence message: relative convergence (4)
## [1] 0.5735036 -0.1764183 -1.8625081 0.7731113 4.1087223 -4.7960925
## [7] 0.5659050 -0.3069639 -6.3595251 -3.5308601 -5.7343479 -2.0083296
## [13] -1.3761588 -4.0118174 -1.4549896 -1.8081272 2.0990774 5.7314898
## [19] 5.0322524 6.3720468 -4.6106462 -0.2565866 -1.6738858 -0.2450237
## [25] 6.1155943 1.4064061 4.3916356 7.3219353 3.3160300 -0.6924247
## [31] -0.2850313 -2.9115860
## attr(,"label")
## [1] "Residuals"
## [1] 20.42650 21.17642 24.66251 20.62689 14.59128 22.89609 13.73410 24.70696
## [9] 29.15953 22.73086 23.53435 18.40833 18.67616 19.21182 11.85499 12.20813
## [17] 12.60092 26.66851 25.36775 27.52795 26.11065 15.75659 16.87389 13.54502
## [25] 13.08441 25.89359 21.60836 23.07806 12.48397 20.39242 15.28503 24.31159
## attr(,"label")
## [1] "Fitted values"
colf_nlxb can be used in exactly the same way as colf_nls. All aspects and features discussed for colf_nls hold for colf_nlxb as well; only the underlying algorithm changes.
mymod <- colf_nlxb(mpg ~ cyl + disp + hp + qsec, mtcars, lower = rep(-2, 5), upper = rep(2, 5))
mymod
## nlsr class object: x
## residual sumsquares = 417.6 on 32 observations
## after 17 Jacobian and 17 function evaluations
## name coeff SEs tstat pval gradient JSingval
## 1 param_X.Intercept. 2.000000 NA NA NA 0.0000e+00 1723.9765
## 2 param_cyl 0.239363 NA NA NA 1.1132e-07 225.9009
## 3 param_disp -0.038676 NA NA NA -9.8498e-10 48.7775
## 4 param_hp 0.010329 NA NA NA -1.3531e-09 4.1741
## 5 param_qsec 1.339146 NA NA NA -1.4715e-08 0.0000
Setting lower, upper and starting values:
#start values are outside boundaries
colf_nlxb(mpg ~ cyl + disp + hp + qsec, mtcars, lower = rep(-2, 5), upper = rep(0.5, 5))
## Error in nlxb(model_ingredients$model_formula, data = model_ingredients$model_data, : Infeasible start
#so they need to be provided
colf_nlxb(mpg ~ cyl + disp + hp + qsec, mtcars, lower = rep(-5, 5), upper = rep(.5, 5),
start = rep(0, 5))
## nlsr class object: x
## residual sumsquares = 2237.9 on 32 observations
## after 5 Jacobian and 5 function evaluations
## name coeff SEs tstat pval gradient JSingval
## 1 param_X.Intercept. 0.500000 NA NA NA 0.0000e+00 1.7213e+03
## 2 param_cyl 0.500000 NA NA NA 0.0000e+00 2.2559e+02
## 3 param_disp -0.025392 NA NA NA 3.8793e-08 3.9200e-14
## 4 param_hp 0.069710 NA NA NA -4.3715e-07 0.0000e+00
## 5 param_qsec 0.500000 NA NA NA 0.0000e+00 0.0000e+00
As with lm and colf_nls, colf_nlxb accepts the same kind of formula syntax:
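For instance, removing the intercept (as before, this and the following calls are reconstructed from the models printed in each output):

#no intercept
colf_nlxb(mpg ~ hp + cyl - 1, mtcars)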
## nlsr class object: x
## residual sumsquares = 3149.5 on 32 observations
## after 3 Jacobian and 3 function evaluations
## name coeff SEs tstat pval gradient JSingval
## 1 param_hp -0.10747 0.045383 -2.3680 2.4529e-02 8.6402e-12 914.0685
## 2 param_cyl 5.40364 1.139220 4.7433 4.8055e-05 -2.1983e-10 8.9873
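Using all available variables as predictors (reconstructed call):

#all variables
colf_nlxb(mpg ~ ., mtcars)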
## nlsr class object: x
## residual sumsquares = 147.49 on 32 observations
## after 4 Jacobian and 4 function evaluations
## name coeff SEs tstat pval gradient
## 1 param_X.Intercept. 12.303374 18.717884 0.65731 0.518124 -1.7917e-09
## 2 param_cyl -0.111440 1.045023 -0.10664 0.916087 6.7267e-11
## 3 param_disp 0.013335 0.017858 0.74676 0.463489 -2.2737e-12
## 4 param_hp -0.021482 0.021769 -0.98684 0.334955 -1.0232e-12
## 5 param_drat 0.787111 1.635373 0.48130 0.635278 6.6166e-11
## 6 param_wt -3.715304 1.894414 -1.96119 0.063252 -1.6046e-11
## 7 param_qsec 0.821041 0.730845 1.12341 0.273941 5.3006e-11
## 8 param_vs 0.317763 2.104509 0.15099 0.881423 -1.8691e-11
## 9 param_am 2.520227 2.056651 1.22540 0.233990 4.5514e-11
## 10 param_gear 0.655413 1.493260 0.43891 0.665206 5.7923e-11
## 11 param_carb -0.199419 0.828752 -0.24063 0.812179 -9.4147e-12
## JSingval
## 1 1724.34152
## 2 226.12682
## 3 50.89759
## 4 5.65107
## 5 4.27912
## 6 4.05009
## 7 1.62429
## 8 1.40497
## 9 1.23732
## 10 1.11377
## 11 0.14119
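Using I() to treat the sum of two variables as a single predictor (reconstructed call):

#sum of hp and cyl as one term
colf_nlxb(mpg ~ I(hp + cyl), mtcars)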
## nlsr class object: x
## residual sumsquares = 438.6 on 32 observations
## after 3 Jacobian and 3 function evaluations
## name coeff SEs tstat pval gradient JSingval
## 1 param_X.Intercept. 30.366699 1.6439622 18.4717 0.000e+00 -2.1718e-08 948.7026
## 2 param_I.hp...cyl. -0.067219 0.0098026 -6.8572 1.307e-07 1.1596e-10 2.3258
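Crossing three variables to include all main effects and interactions (reconstructed call):

#main effects plus all interactions
colf_nlxb(mpg ~ hp * cyl * disp, mtcars)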
## nlsr class object: x
## residual sumsquares = 153.85 on 32 observations
## after 4 Jacobian and 4 function evaluations
## name coeff SEs tstat pval gradient
## 1 param_X.Intercept. 9.2899e+01 2.7046e+01 3.4348 0.0021642 -7.9286e-08
## 2 param_hp -4.7032e-01 2.5860e-01 -1.8187 0.0814593 6.7007e-10
## 3 param_cyl -1.0595e+01 4.9353e+00 -2.1468 0.0421264 1.3996e-08
## 4 param_disp -3.8649e-01 1.9265e-01 -2.0061 0.0562395 4.5793e-10
## 5 param_hp.cyl 6.7338e-02 3.8482e-02 1.7499 0.0929203 -5.3296e-10
## 6 param_hp.disp 2.8082e-03 2.0286e-03 1.3843 0.1790009 -1.7462e-08
## 7 param_cyl.disp 5.2702e-02 2.7608e-02 1.9089 0.0682965 -5.7480e-10
## 8 param_hp.cyl.disp -3.8412e-04 2.6774e-04 -1.4347 0.1642846 -1.3970e-07
## JSingval
## 1 2.3660e+06
## 2 2.1312e+04
## 3 2.2941e+03
## 4 1.2322e+03
## 5 5.9652e+01
## 6 3.5301e+01
## 7 1.9948e+00
## 8 9.2185e-02
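An interaction term on its own (reconstructed call):

#interaction only
colf_nlxb(mpg ~ hp:cyl, mtcars)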
## nlsr class object: x
## residual sumsquares = 407.2 on 32 observations
## after 3 Jacobian and 3 function evaluations
## name coeff SEs tstat pval gradient
## 1 param_X.Intercept. 27.2664949 1.1817113 23.0737 0.0000e+00 -1.0810e-08
## 2 param_hp.cyl -0.0071304 0.0009798 -7.2774 4.2026e-08 2.0009e-11
## JSingval
## 1 6822.6123
## 2 3.1177
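Two main effects together with their interaction (reconstructed call):

#main effects and interaction
colf_nlxb(mpg ~ hp * cyl, mtcars)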
## nlsr class object: x
## residual sumsquares = 247.6 on 32 observations
## after 3 Jacobian and 3 function evaluations
## name coeff SEs tstat pval gradient
## 1 param_X.Intercept. 50.751202 6.5116856 7.7939 1.7242e-08 -9.6764e-07
## 2 param_hp -0.170680 0.0691016 -2.4700 1.9870e-02 9.2869e-09
## 3 param_cyl -4.119139 0.9882291 -4.1682 2.6725e-04 1.3703e-07
## 4 param_hp.cyl 0.019737 0.0088109 2.2401 3.3202e-02 -1.2824e-09
## JSingval
## 1 6881.82249
## 2 155.24157
## 3 8.31320
## 4 0.45214
set.seed(10)
newdata <- data.frame(hp = mtcars$hp, cyl = mtcars$cyl, disp = mtcars$disp, qsec = mtcars$qsec)
predict(mymod, newdata)
## 1 2 3 4 5 6 7 8 9 10 11
## 20.426 21.176 24.663 20.627 14.591 22.896 13.734 24.707 29.160 22.731 23.534
## 12 13 14 15 16 17 18 19 20 21 22
## 18.408 18.676 19.212 11.855 12.208 12.601 26.669 25.368 27.528 26.111 15.757
## 23 24 25 26 27 28 29 30 31 32
## 16.874 13.545 13.084 25.894 21.608 23.078 12.484 20.392 15.285 24.312
As with colf_nls, in colf_nlxb keeping names and classes the same is vital:
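Renaming one of the columns (the new name here is arbitrary; reconstructed to match the error below):

#change a column name
newdata1 <- newdata
names(newdata1)[1] <- "horsepower"
predict(mymod, newdata1)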
## Error in eval(predvars, data, env): object 'hp' not found
#change column class
newdata2 <- newdata
newdata2$cyl <- as.character(newdata2$cyl)
predict(mymod, newdata2)
## Error in value[[3L]](cond): newdata column classes need to be the same as original data
The rest of the methods provided are demonstrated below:
Please make sure you read the section about the interpretation of the p-values under colf_nls when running a constrained regression. The same principles described there hold for colf_nlxb.
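The summary below would come from (reconstructed call):

summary(mymod)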
## $resname
## [1] "mymod"
##
## $ssquares
## [1] 417.6
##
## $nobs
## [1] 32
##
## $coeff
## param_X.Intercept. param_cyl param_disp param_hp
## 2.000000 0.239363 -0.038676 0.010329
## param_qsec
## 1.339146
##
## $ct
## [1] "U" " " " " " " " "
##
## $mt
## [1] " " " " " " " " " "
##
## $SEs
## [1] NA NA NA NA NA
##
## $tstat
## [1] NA NA NA NA NA
##
## $pval
## [1] NA NA NA NA NA
##
## $Sd
## [1] 1723.9765 225.9009 48.7775 4.1741 0.0000
##
## $gr
## [,1]
## param_X.Intercept. 0.0000e+00
## param_cyl 1.1132e-07
## param_disp -9.8498e-10
## param_hp -1.3531e-09
## param_qsec -1.4715e-08
##
## $jeval
## [1] 17
##
## $feval
## [1] 17
##
## $data_frame_to_print
## name coeff SEs tstat pval gradient JSingval
## 1 param_X.Intercept. 2.000000 NA NA NA 0.0000e+00 1723.9765
## 2 param_cyl 0.239363 NA NA NA 1.1132e-07 225.9009
## 3 param_disp -0.038676 NA NA NA -9.8498e-10 48.7775
## 4 param_hp 0.010329 NA NA NA -1.3531e-09 4.1741
## 5 param_qsec 1.339146 NA NA NA -1.4715e-08 0.0000
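Coefficients (reconstructed call):

coef(mymod)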
## param_X.Intercept. param_cyl param_disp param_hp
## 2.000000 0.239363 -0.038676 0.010329
## param_qsec
## 1.339146
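Printing the model (reconstructed call):

print(mymod)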
## nlsr class object: x
## residual sumsquares = 417.6 on 32 observations
## after 17 Jacobian and 17 function evaluations
## name coeff SEs tstat pval gradient JSingval
## 1 param_X.Intercept. 2.000000 NA NA NA 0.0000e+00 1723.9765
## 2 param_cyl 0.239363 NA NA NA 1.1132e-07 225.9009
## 3 param_disp -0.038676 NA NA NA -9.8498e-10 48.7775
## 4 param_hp 0.010329 NA NA NA -1.3531e-09 4.1741
## 5 param_qsec 1.339146 NA NA NA -1.4715e-08 0.0000
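Residuals (note in the output that nlsr residuals carry a gradient attribute; reconstructed call):

resid(mymod)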
## Mazda RX4 Mazda RX4 Wag Datsun 710 Hornet 4 Drive
## -0.57350 0.17642 1.86251 -0.77311
## Hornet Sportabout Valiant Duster 360 Merc 240D
## -4.10872 4.79609 -0.56591 0.30696
## Merc 230 Merc 280 Merc 280C Merc 450SE
## 6.35952 3.53086 5.73435 2.00833
## Merc 450SL Merc 450SLC Cadillac Fleetwood Lincoln Continental
## 1.37616 4.01182 1.45499 1.80813
## Chrysler Imperial Fiat 128 Honda Civic Toyota Corolla
## -2.09908 -5.73149 -5.03225 -6.37205
## Toyota Corona Dodge Challenger AMC Javelin Camaro Z28
## 4.61065 0.25659 1.67389 0.24502
## Pontiac Firebird Fiat X1-9 Porsche 914-2 Lotus Europa
## -6.11559 -1.40641 -4.39164 -7.32194
## Ford Pantera L Ferrari Dino Maserati Bora Volvo 142E
## -3.31603 0.69242 0.28503 2.91159
## attr(,"gradient")
## param_X.Intercept. param_cyl param_disp param_hp param_qsec
## [1,] 1 6 160.0 110 16.46
## [2,] 1 6 160.0 110 17.02
## [3,] 1 4 108.0 93 18.61
## [4,] 1 6 258.0 110 19.44
## [5,] 1 8 360.0 175 17.02
## [6,] 1 6 225.0 105 20.22
## [7,] 1 8 360.0 245 15.84
## [8,] 1 4 146.7 62 20.00
## [9,] 1 4 140.8 95 22.90
## [10,] 1 6 167.6 123 18.30
## [11,] 1 6 167.6 123 18.90
## [12,] 1 8 275.8 180 17.40
## [13,] 1 8 275.8 180 17.60
## [14,] 1 8 275.8 180 18.00
## [15,] 1 8 472.0 205 17.98
## [16,] 1 8 460.0 215 17.82
## [17,] 1 8 440.0 230 17.42
## [18,] 1 4 78.7 66 19.47
## [19,] 1 4 75.7 52 18.52
## [20,] 1 4 71.1 65 19.90
## [21,] 1 4 120.1 97 20.01
## [22,] 1 8 318.0 150 16.87
## [23,] 1 8 304.0 150 17.30
## [24,] 1 8 350.0 245 15.41
## [25,] 1 8 400.0 175 17.05
## [26,] 1 4 79.0 66 18.90
## [27,] 1 4 120.3 91 16.70
## [28,] 1 4 95.1 113 16.90
## [29,] 1 8 351.0 264 14.50
## [30,] 1 6 145.0 175 15.50
## [31,] 1 8 301.0 335 14.60
## [32,] 1 4 121.0 109 18.60
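Fitted values (reconstructed call):

fitted(mymod)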
## Mazda RX4 Mazda RX4 Wag Datsun 710 Hornet 4 Drive
## 20.426 21.176 24.663 20.627
## Hornet Sportabout Valiant Duster 360 Merc 240D
## 14.591 22.896 13.734 24.707
## Merc 230 Merc 280 Merc 280C Merc 450SE
## 29.160 22.731 23.534 18.408
## Merc 450SL Merc 450SLC Cadillac Fleetwood Lincoln Continental
## 18.676 19.212 11.855 12.208
## Chrysler Imperial Fiat 128 Honda Civic Toyota Corolla
## 12.601 26.669 25.368 27.528
## Toyota Corona Dodge Challenger AMC Javelin Camaro Z28
## 26.111 15.757 16.874 13.545
## Pontiac Firebird Fiat X1-9 Porsche 914-2 Lotus Europa
## 13.084 25.894 21.608 23.078
## Ford Pantera L Ferrari Dino Maserati Bora Volvo 142E
## 12.484 20.392 15.285 24.312