fix(inductive): stabilize labels handling for graph runs #29
melvinbarbaux merged 1 commit into main from
Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 0e08495d52
```yaml
- id: labels.encode
- id: labels.to_torch
  params:
    device: "auto"
    dtype: int64
```
Avoid converting raw string labels to torch
The newly added labels.to_torch step consumes raw.y (see modssc/preprocess/steps/labels/to_torch.py) and overwrites the encoded labels produced by labels.encode. As a result, any dataset whose raw labels are non-numeric (e.g., OpenML classification labels that come back as strings/objects) will now fail preprocessing at torch.as_tensor(raw.y) instead of using the encoded integer labels, so those runs won't start. The regression only surfaces when raw.y is not already numeric. Consider converting labels.y to torch instead, or ensure labels are numeric before invoking labels.to_torch in these configs.
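A minimal sketch of the failure mode described above, using NumPy in place of the project's torch conversion (the `raw_y` array and the encoding via `np.unique` are illustrative assumptions, not the repo's actual `labels.encode` implementation):

```python
import numpy as np

# Hypothetical raw labels as they might come back from OpenML:
# an object array of strings. torch.as_tensor(raw_y) would raise
# a TypeError on this dtype -- the failure described in the review.
raw_y = np.array(["cat", "dog", "cat", "bird"], dtype=object)

# Encode first (the role played by labels.encode): map each class
# name to a stable integer index, then cast to int64 so the later
# tensor conversion (dtype: int64 in the config) is well-defined.
classes, encoded = np.unique(raw_y, return_inverse=True)
encoded = encoded.astype(np.int64)

# 'encoded' is now a numeric array ([1, 2, 1, 0] with classes sorted
# as ["bird", "cat", "dog"]), which torch.as_tensor accepts; this is
# why the to_torch step should consume the encoded labels, not raw.y.
```

This is why the suggested fix is to point the conversion at the encoded output rather than the raw labels, or to guarantee numeric labels before the to_torch step runs.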
Summary
What does this PR change?
Checklist
Notes
Anything reviewers should know?