Refactor regression code and docs (for #109) (#170)
* refactored regression code and docs
* added int to float data conversion, and methods for vector-type regressors
* added isotonic regression
* added docs for regression
docs/src/index.md (+1 -1)
```diff
@@ -11,7 +11,7 @@ end
 [MultivariateStats.jl](https://github.com/JuliaStats/MultivariateStats.jl) is a Julia package for multivariate statistical analysis. It provides a rich set of useful analysis techniques, such as PCA, CCA, LDA, ICA, etc.
```
```diff
@@ -17,8 +17,23 @@ _vaug(X::AbstractMatrix{T}) where T = vcat(X, ones(T, 1, size(X,2)))::Matrix{T}
 _haug(X::AbstractMatrix{T}) where T = hcat(X, ones(T, size(X,1), 1))::Matrix{T}
 
-## linear least square
+## Linear Least Square Regression
+
+"""
+    llsq(X, y; ...)
+
+Solve the linear least square problem.
+
+Here, `y` can be either a vector, or a matrix where each column is a response vector.
+
+This function accepts two keyword arguments:
+
+- `dims`: whether input observations are stored as rows (`1`) or columns (`2`). (default is `1`)
+- `bias`: whether to include the bias term `b`. (default is `true`)
+
+The function returns the solution `a`. In particular, when `y` is a vector (matrix), `a` is also a vector (matrix). If `bias` is true, then the returned array is augmented as `[a; b]`.
     dims::Union{Integer,Nothing}=nothing) where {T<:Real}
```
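As a sanity check on the new docstring, here is a minimal sketch of what `llsq` computes with `bias=true` and `dims=1` (observations in rows), written in plain Julia without the package. The ones-column augmentation mirrors `_haug` above; the data is hypothetical.

```julia
using LinearAlgebra

# Hypothetical noise-free data: y = X*a_true .+ b_true, observations in rows
X = randn(100, 3)
a_true = [2.0, -1.0, 0.5]
b_true = 3.0
y = X * a_true .+ b_true

# With bias=true, the design matrix is augmented with a column of ones
# (cf. _haug) and the least-squares problem is solved for [a; b]
Xa = hcat(X, ones(size(X, 1)))
sol = Xa \ y

a, b = sol[1:end-1], sol[end]   # coefficients and bias, as [a; b]
```

With noise-free data the recovered `a` and `b` match `a_true` and `b_true` up to floating-point error.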
```diff
@@ -37,9 +52,32 @@ function llsq(X::AbstractMatrix{T}, Y::AbstractVecOrMat{T};
     end
     _ridge(X, Y, zero(T), dims == 2, bias)
 end
+llsq(x::AbstractVector{T}, y::AbstractVector{T}; kwargs...) where {T<:Real} =
+    llsq(x[:,:], y; dims=1, kwargs...)
+
+## Ridge Regression (Tikhonov regularization)
+
+"""
+    ridge(X, y, r; ...)
+
+Solve the ridge regression problem.
+
+Here, ``y`` can be either a vector, or a matrix where each column is a response vector.
+
+The argument `r` gives the quadratic regularization matrix ``Q``, which can be in either of the following forms:
+
+- `r` is a real scalar, then ``Q`` is considered to be `r * eye(n)`, where `n` is the dimension of `a`.
+- `r` is a real vector, then ``Q`` is considered to be `diagm(r)`.
+- `r` is a real symmetric matrix, then ``Q`` is simply considered to be `r`.
 
-## ridge regression
+This function accepts two keyword arguments:
 
+- `dims`: whether input observations are stored as rows (`1`) or columns (`2`). (default is `1`)
+- `bias`: whether to include the bias term `b`. (default is `true`)
+
+The function returns the solution `a`. In particular, when `y` is a vector (matrix), `a` is also a vector (matrix). If `bias` is true, then the returned array is augmented as `[a; b]`.
```
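The ridge docstring above can likewise be illustrated with a short plain-Julia sketch of the normal equations `(XᵀX + Q) a = Xᵀ y` for a scalar `r` (so `Q = r * I`), with `bias=true` and `dims=1`. Whether the package regularizes the bias term is an implementation detail not stated here; this sketch leaves it unregularized, which is a common convention, and the data is hypothetical.

```julia
using LinearAlgebra

# Hypothetical data, observations in rows (dims = 1)
X = randn(50, 3)
y = X * [1.0, 2.0, -1.0] .+ 0.5

r = 0.1                          # scalar r, i.e. Q = r * I on the coefficients
Xa = hcat(X, ones(size(X, 1)))   # bias=true: augment with a ones column

# Ridge normal equations; the trailing 0.0 leaves the bias term
# unregularized (an assumption, not taken from the docstring)
Q = Diagonal([fill(r, size(X, 2)); 0.0])
sol = (Xa' * Xa + Q) \ (Xa' * y)

a, b = sol[1:end-1], sol[end]    # shrunken coefficients and bias, as [a; b]
```

As `r → 0` this reduces to the ordinary least-squares solution; larger `r` shrinks `a` toward zero.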