
The problem: the targets range from -0.05 to 0.3 for inputs of shape (2,), but the model's predictions range from roughly 0.15 to 0.65, and there are also many predictions below 0. This can happen when the training set does not cover that region of the output space well enough. To address it:

1\. Collect more training data, or build a dataset whose examples are spread across the whole output range.

2\. Constrain the network so that every prediction lies in [-0.05, 0.3], for example by rescaling the final activation to that interval; this also reduces the impact of poorly covered regions during training.

The behaviour may have several other causes: the model may already have converged to an approximate solution while many epochs and iterations remain, it may be overfitting, or the data may not be fully normalized. The symptoms also suggest an exploding or vanishing gradients problem caused by the sigmoid activations in the hidden layers or by the weight initialization. To check, inspect the weights feeding the sigmoid output neurons and the input values when training starts. Possible fixes:

1\. Increase `l2_weight_decay`, provided you are not also adding more parameters; stronger decay can slow training, so verify that the extra regularization actually helps rather than just lengthening the training process.

2\. Normalize the inputs. This matters for any model that expects inputs of a specific shape, and especially for large models.

3\. Initialize the final layer with a small positive bias so that its sigmoid output neurons start away from their saturated regions.

4\. Change the activation function where appropriate, e.g. at the output layer, so that inputs which should map to different targets are better separated during training without the gradients exploding.
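As a minimal sketch of point 2 above, a sigmoid output can be affinely rescaled so every prediction falls inside the target interval. The function name `scaled_sigmoid` and the constants are illustrative, not from the original question beyond the stated range:

```python
import math

# Target interval taken from the question: predictions should lie in [-0.05, 0.3].
LOW, HIGH = -0.05, 0.3

def scaled_sigmoid(z):
    """Squash a raw network output z into [LOW, HIGH].

    sigmoid(z) lies in (0, 1); an affine rescale maps that interval
    onto [LOW, HIGH], so predictions can never fall outside it.
    """
    s = 1.0 / (1.0 + math.exp(-z))
    return LOW + (HIGH - LOW) * s

# Even extreme raw outputs stay inside the target range.
print(round(scaled_sigmoid(-50.0), 6))  # → -0.05
print(round(scaled_sigmoid(50.0), 6))   # → 0.3
```

In a framework such as Keras or PyTorch the same rescale would simply be applied as the last operation of the forward pass.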
Also normalize the weights (or use a principled initialization scheme) before training. Another possibility: training at first progresses quickly, then the loss or accuracy plateaus and improvement slows, which often signals overfitting; consider making adjustments before that stage. Try regularization such as L1/L2 penalties or dropout. Other factors may be involved as well, so recheck the activation functions and the weights. If the input variables differ greatly in scale, normalize them, and inspect the offending data points to see why they are not behaving as expected.
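The input-normalization advice above can be sketched as a plain z-score transform; the helper name `standardize` and the sample data are assumptions for illustration, not part of the original question:

```python
import statistics

def standardize(columns):
    """Z-score each feature column to zero mean and unit standard deviation.

    Large scale differences between inputs can saturate sigmoid units and
    make gradients explode or vanish; standardizing removes that imbalance.
    """
    normed = []
    for col in columns:
        mu = statistics.fmean(col)
        sigma = statistics.pstdev(col) or 1.0  # guard against constant columns
        normed.append([(x - mu) / sigma for x in col])
    return normed

# Two features on very different scales, as with inputs of shape (2,).
features = [[1.0, 2.0, 3.0], [100.0, 200.0, 300.0]]
scaled = standardize(features)
```

In practice the per-feature mean and standard deviation must be computed on the training set only and reused unchanged on validation and test data.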