In recent years, deep neural networks have made substantial progress in object recognition. However, one open issue in deep learning is that it is often unclear which proposed architecture is best suited to a specific problem. As a result, distinct configurations are tried until one that produces satisfactory results is found. This paper describes a distributed supervised learning method for finding the best network architecture by dynamically modifying its parameters for a given task. On the MNIST data set, it is shown that asynchronous supervised learning can converge on a good region of the solution space.

Setting the many hyperparameters of a neural network can be time-consuming. In this post, we provide some tips and guidelines for better organizing your hyperparameter tuning process, which should help you find a good setting for the hyperparameters much faster.
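Random search is one simple way to organize such a tuning loop. The sketch below is illustrative only: the search space, hyperparameter names, and the stand-in objective are assumptions, not details from the paper. In a real run, `evaluate` would train a network with the given configuration and return its validation accuracy.

```python
import random

# Hypothetical search space; the names and ranges are illustrative
# assumptions, not taken from the paper.
SEARCH_SPACE = {
    "learning_rate": [1e-1, 1e-2, 1e-3, 1e-4],
    "hidden_units": [64, 128, 256, 512],
    "batch_size": [32, 64, 128],
}

def sample_config(space, rng):
    """Draw one random configuration from the search space."""
    return {name: rng.choice(values) for name, values in space.items()}

def random_search(evaluate, space, n_trials=20, seed=0):
    """Try n_trials random configurations and keep the best-scoring one."""
    rng = random.Random(seed)
    best_config, best_score = None, float("-inf")
    for _ in range(n_trials):
        config = sample_config(space, rng)
        score = evaluate(config)  # e.g. validation accuracy
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

# Stand-in objective for demonstration: rewards configurations close to
# an arbitrary "good" setting. A real objective would train and validate.
def dummy_evaluate(config):
    return -abs(config["learning_rate"] - 1e-2) - abs(config["hidden_units"] - 256) / 1000.0

best, score = random_search(dummy_evaluate, SEARCH_SPACE, n_trials=50)
```

Because each trial is independent, this loop parallelizes naturally across workers, which is the same property that makes distributed and asynchronous search strategies attractive.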