@@ -85,10 +85,13 @@ MMI.training_losses(embedder::EntityEmbedder, report) =
- In MLJ (or MLJBase) bind an instance unsupervised `model` to data with
+ In MLJ (or MLJBase), bind an instance `embed_model` to data with
- mach = machine(model , X, y)
+ mach = machine(embed_model, X, y)
Here:
+ - `embed_model` is an instance of `EntityEmbedder`, which wraps a supervised MLJFlux model.
+   The supervised model must be one of: `MLJFlux.NeuralNetworkClassifier`, `MLJFlux.NeuralNetworkBinaryClassifier`,
+   `MLJFlux.NeuralNetworkRegressor`, or `MLJFlux.MultitargetNeuralNetworkRegressor`.
- `X` is any table of input features supported by the model being wrapped. Features to be transformed must
have element scitype `Multiclass` or `OrderedFactor`. Use `schema(X)` to
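For instance, a minimal sketch (not part of the diff) of the usual scitype check-and-coerce step; the table `X` and the column name `:Column2` here are assumptions for illustration only:

```julia
# Minimal sketch: confirm categorical features carry the scitypes the embedder expects.
# `X` and :Column2 are illustrative assumptions, not names fixed by the docstring.
using MLJ   # re-exports schema, coerce and the Multiclass scitype

schema(X)                               # inspect the element scitype of each column
X = coerce(X, :Column2 => Multiclass)   # coerce a feature that should be embedded
```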
@@ -129,25 +132,42 @@ X = (;
repeat(["group1", "group1", "group2", "group2", "group3"], Int(N / 5)),
),
)
- y = categorical([1, 2, 3, 4, 5, 6, 7, 8, 9, 10]) # Classification
+ y = categorical(repeat(["class1", "class2", "class3", "class4", "class5"], Int(N / 5)))
- # Initiate model
- EntityEmbedder = @load EntityEmbedder pkg=MLJFlux
+ # Load the entity embedder, its neural network backbone, and the SVC, which inherently supports
+ # only continuous features
+ EntityEmbedder = @load EntityEmbedder pkg=MLJFlux
NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux
+ SVC = @load SVC pkg=LIBSVM
- clf = NeuralNetworkClassifier(embedding_dims=Dict(:Column2 => 2, :Column3 => 2))
- emb = EntityEmbedder(clf)
+ emb = EntityEmbedder(NeuralNetworkClassifier(embedding_dims=Dict(:Column2 => 2, :Column3 => 2)))
+ clf = SVC(cost = 1.0)
+
+ pipeline = emb |> clf
# Construct machine
- mach = machine(emb , X, y)
+ mach = machine(pipeline, X, y)
# Train model
fit!(mach)
+ # Predict
+ yhat = predict(mach, X)
+
# Transform data using model to encode categorical columns
- Xnew = transform(mach, X)
- Xnew
+ machy = machine(emb, X, y)
+ fit!(machy)
+ julia> Xnew = transform(machy, X)
+ (Column1 = Float32[1.0, 2.0, 3.0, … ],
+ Column2_1 = Float32[1.2, 0.08, -0.09, -0.2, 0.94, 1.2, … ],
+ Column2_2 = Float32[-0.87, -0.34, -0.8, 1.6, 0.75, -0.87, …],
+ Column3_1 = Float32[-0.0, 1.56, -0.48, -0.9, -0.9, -0.0, …],
+ Column3_2 = Float32[-1.0, 1.1, -1.54, 0.2, 0.2, -1.0, … ],
+ Column4 = Float32[1.0, 2.0, 3.0, 4.0, 5.0, 1.0, … ],
+ Column5 = Float32[0.27, 0.12, -0.60, 1.5, -0.6, -0.123, … ],
+ Column6_1 = Float32[-0.99, -0.99, 0.8, 0.8, 0.34, -0.99, … ],
+ Column6_2 = Float32[-1.00, -1.0, 0.19, 0.19, 1.7, -1.00, … ])
```
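Not part of the diff above: a minimal sketch of how the fitted pipeline from the new example might be evaluated with MLJ's resampling machinery. The names `mach`, `X`, and `y` are taken from the example; the choice of 3-fold cross-validation and the `accuracy` measure is an assumption for illustration.

```julia
# Minimal sketch, assuming `mach`, `X` and `y` as defined in the example above.
using MLJ

# Cross-validate the EntityEmbedder |> SVC pipeline; `accuracy` applies because
# SVC returns point (deterministic) class predictions.
r = evaluate!(mach; resampling=CV(nfolds=3, shuffle=true), measure=accuracy)
r.measurement   # aggregated accuracy estimate
```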
See also