@compact is great and I'm looking forward to more development/moving it out of experimental. 🙏
#540
Replies: 3 comments 1 reply
-
Can you elaborate on what you mean by the last part (maybe an example)?
-
For sure. In Flax you can either explicitly define the layers in `setup`, or define them more implicitly in the actual call; Lux currently keeps the layer definitions in the `@compact` arguments, but could potentially allow them inside the `do` block, as in the sketches below. I'm not the best at metaprogramming so I'm not sure about the feasibility of this, but having the option to keep the layer definitions and the control flow of the inputs in the same place can be nice.
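Explicit definition in `setup` looks roughly like this (a minimal sketch; layer names and sizes are just illustrative):

```python
import flax.linen as nn

class MLP(nn.Module):
    # Layers are declared up front in setup(), separate from the call.
    def setup(self):
        self.dense1 = nn.Dense(32)
        self.dense2 = nn.Dense(1)

    def __call__(self, x):
        x = nn.relu(self.dense1(x))
        return self.dense2(x)
```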
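And the implicit, in-call style with `@nn.compact`, where each layer is declared right where it is used:

```python
import flax.linen as nn

class MLP(nn.Module):
    # Layers are declared inline, inside the forward pass itself.
    @nn.compact
    def __call__(self, x):
        x = nn.relu(nn.Dense(32)(x))
        return nn.Dense(1)(x)
```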
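Currently Lux works like this (a rough sketch against the experimental `@compact` API, again with illustrative names):

```julia
using Lux

# Layers live in the @compact arguments; the do block only carries
# the forward pass.
model = @compact(dense1=Dense(2 => 32, relu), dense2=Dense(32 => 1)) do x
    dense2(dense1(x))
end
```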
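but could potentially work like this (hypothetical syntax, not something Lux actually supports):

```julia
# Hypothetical: layer definitions move inside the do block itself,
# Flax-style, so definition and data flow sit together.
model = @compact() do x
    dense1 = Dense(2 => 32, relu)
    dense2 = Dense(32 => 1)
    dense2(dense1(x))
end
```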
-
#584 moves `@compact` out of experimental!
-
Thanks for developing `@compact`, it makes writing neural nets so much easier. This is partly because I'm coming from Python frameworks which are more similar to that, but also because it introduces so much flexibility and readability. To me this is a huge improvement over defining models only with `Chain`. While `Chain` is great and gives a lot of improved usability over PyTorch and Flax's `nn.Sequential`, it is still limiting for more complex models.

As it currently stands it's totally fine, but what do you think about trying to move the layer definition into the `do` block like Flax's `@nn.compact`? Thanks again 🤗
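For a concrete sense of the limitation, a rough sketch (names and sizes are just illustrative): a plain stack is natural with `Chain`, but even a skip connection pushes you toward extra combinators, whereas `@compact` lets you write the data flow directly:

```julia
using Lux

# A plain feed-forward stack is natural with Chain:
mlp = Chain(Dense(2 => 32, relu), Dense(32 => 1))

# A skip connection needs a dedicated combinator with Chain
# (e.g. SkipConnection), but with @compact it is ordinary Julia code:
block = @compact(dense=Dense(32 => 32, relu)) do x
    x .+ dense(x)   # skip connection written inline
end
```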