Building Models

After completing all the data collection and preprocessing steps, it was time to get started on the most interesting part of the project - model building! Before we dove into CNNs, we tried some basic classifiers to compare different machine learning algorithms and familiarize ourselves with the data.

We pulled numpy files of the data from Google Cloud Storage. This data has already been preprocessed and rendered into 28x28 grayscale bitmaps in numpy format. Since the entire dataset included over 345 categories, we eventually went with a small subset that contains only the following 5 classes: airplane, alarm clock, ambulance, angel, and animal migration. As we were still deciding on the best model to proceed further with the analysis, we used limited data at this initial step to lessen execution time.

We first began with a Random Forest classifier. We utilized GridSearchCV to cross-validate the model and optimize its parameters. We found that accuracy tends to plateau after 100 trees, so we used n_estimators = 100 as our final model, which returned an accuracy of 0.8291.

Selecting an Optimizer

A key step before proceeding with training is deciding which optimizer to use. After referring to the literature and taking advice from experts on various Kaggle forums, we decided to compare the performance of the Adam optimizer and the SGD optimizer on our data. After multiple iterations, we chose the Adam optimizer because we observed that it showed slightly better results and converged faster than SGD.

After running the model on 25000 images per class for about 3 hours, we obtained a score of 0.76 on Kaggle’s Public Leaderboard - not a bad result for merely 25000 images per class! To put this into perspective, the average class in the dataset contains 150000 images.
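The data-loading step described above - pulling per-class numpy files and stacking them into a labeled training set - can be sketched as follows. This is a minimal illustration rather than our exact code: the `load_subset` helper, the local file paths, and the per-class cap are assumptions for the example (the real per-class `.npy` files live in Google Cloud Storage).

```python
import numpy as np

# The five classes we kept; paths to their .npy files are hypothetical
# local copies of the files Google hosts on Cloud Storage.
classes = ["airplane", "alarm clock", "ambulance", "angel", "animal migration"]

def load_subset(paths, n_per_class=25000):
    """Stack per-class arrays of flattened 28x28 bitmaps into X, y."""
    xs, ys = [], []
    for label, path in enumerate(paths):
        data = np.load(path)[:n_per_class]  # each row is a 784-value bitmap
        xs.append(data)
        ys.append(np.full(len(data), label))
    return np.concatenate(xs), np.concatenate(ys)
```

Capping each class at load time is what keeps the early experiments fast while the final model choice is still open.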
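The Random Forest grid search described above can be sketched like this. The parameter grid and fold count are illustrative assumptions - the post only fixes the final choice of n_estimators = 100:

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Illustrative grid: accuracy plateaued around 100 trees in our runs,
# so n_estimators is the parameter being swept here.
param_grid = {"n_estimators": [10, 25, 50, 100, 200]}

search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid,
    cv=3,                 # 3-fold cross-validation
    scoring="accuracy",
)
# search.fit(X_train, y_train)  # X_train: flattened 28x28 bitmaps
# print(search.best_params_, search.best_score_)
```

`best_params_` then reports the tree count with the highest cross-validated accuracy.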
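To make the Adam-versus-SGD comparison concrete, here is a toy NumPy illustration of the two update rules on a one-dimensional quadratic. This is not our Keras training code - just the mechanics behind why Adam often converges faster: it rescales each step using running estimates of the gradient's first and second moments, while SGD takes a fixed-rate step.

```python
import numpy as np

def sgd_step(w, grad, lr=0.1):
    # Vanilla SGD: a fixed-rate step against the gradient.
    return w - lr * grad

def adam_step(w, grad, state, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: adapt the step from running moment estimates of the gradient.
    state["t"] += 1
    state["m"] = b1 * state["m"] + (1 - b1) * grad
    state["v"] = b2 * state["v"] + (1 - b2) * grad ** 2
    m_hat = state["m"] / (1 - b1 ** state["t"])   # bias-corrected mean
    v_hat = state["v"] / (1 - b2 ** state["t"])   # bias-corrected variance
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), state

# Minimize f(w) = w**2 (gradient 2w) from the same starting point.
w_sgd = w_adam = 5.0
state = {"t": 0, "m": 0.0, "v": 0.0}
for _ in range(200):
    w_sgd = sgd_step(w_sgd, 2 * w_sgd)
    w_adam, state = adam_step(w_adam, 2 * w_adam, state)
```

Both optimizers reach the minimum here; on the noisy, high-dimensional loss of a real CNN, Adam's per-parameter scaling is what tends to buy the faster convergence we observed.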
This project was built by Akhilesh Reddy, Vincent Kuo, Kirti Pande, Tiffany Sung, and Helena Shi. In this blog post, we describe our process of understanding, fitting models on, and finding a fun application of the Google Quick, Draw! dataset. Walk with us through this journey to see how we have tackled the challenge of successfully classifying what is “arguably the world’s cutest research dataset!” To see the full code used, find our github:

I.

In 2016, Google released an online game titled “Quick, Draw!” - an AI experiment that has educated the public on neural networks and built an enormous dataset of over a billion drawings. It prompts the player to doodle an image in a certain category, and while the player is drawing, the neural network guesses what the image depicts in a human-to-computer game of Pictionary. You can find more information on the game here, or play the game yourself!

Since the release of 50 million drawings in the dataset, the ML community has already begun exploring applications of the data in improving handwriting recognition, training the Sketch RNN model to teach machines to draw, and more. Notably, it has robust potential in OCR (Optical Character Recognition), ASR (Automatic Speech Recognition) & NLP (Natural Language Processing), and reveals lovely insights into how people around the world are different, yet the same.

Both the greyscale/color encoding and the image augmentation used OpenCV and ImageDataGenerator from Keras, loading batches of raw data from csv files and transforming them into images.

“I felt compelled to echo the visual strength of Russian graphics coupled with a folk art style,” Matt explains. The Doodle also features Tchaikovsky’s “Serenade for Strings in C Major, Op. 48: Waltz,” as performed by the Moscow Virtuosi Chamber Orchestra. “The accompanying score gave the short film the momentum and gravitas that brought the whole project together.” Check out scene-by-scene GIFs of today’s Trans-Siberian Railway Doodle below.
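The Quick, Draw! reshape-and-augment preprocessing mentioned above used OpenCV and Keras; as a minimal NumPy stand-in for the same idea (the helper names and the 2-pixel shift are made up for illustration), it looks like:

```python
import numpy as np

def rows_to_images(batch):
    """Reshape flattened 784-value rows into 28x28 grayscale bitmaps."""
    return batch.reshape(-1, 28, 28)

def shift_right(img, px=2):
    """Toy augmentation: shift a bitmap right by px pixels, zero-padding."""
    out = np.zeros_like(img)
    out[:, px:] = img[:, :-px]
    return out
```

In the real pipeline, Keras's generator streams such batches to the model so the full dataset never has to sit in memory at once.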
From the country’s small villages to its big cities, Russia depends on the mighty Trans-Siberian Railway to traverse more than 6,000 miles and seven time zones between Moscow and Vladivostok. In just seven days, the railway transports travelers and cargo from western Russia, across rocky tundra and frequently impassable countryside, all the way to the Pacific Ocean. Built over 26 years and completed a century ago, it remains a critical facet of Russian trade with Europe and China, and is a stalwart example of Russian engineering. Today’s Doodle celebrates the railway that helped link a nation with the world.

Doodler Matt Cruickshank relied on his first-hand experience traveling the Trans-Siberian Railway in April 2015 to bring the Doodle to life. During the trip, Matt sketched out images of the Russian cities and countryside that would help form the foundation of the animation.