We have finally reached the end of the AI at the Zoo series, and it’s time to start Testing in the Wild. In today’s post we will see some cute animal pictures, discuss how the app performed, and cover some opportunities for improvement and next steps.
Off to the Zoo
We headed off to the Smithsonian National Zoo to test the app. Thankfully it was a bit cooler and rainy, so it was nice and quiet there. We visited all of the animals the app was trained to detect: pandas, zebras, elephants, lions, and tigers.
The app successfully identified zebras, elephants, and lions. It struggled with pandas, which it seemed to think were lions. Additionally, the tiger was sleeping, so the app thought it was a lion too.
Here are some of the pictures of the animals and the app results:
[Photos of the animals alongside the app’s identification results]
Areas for Improvement
There were a few areas for improvement that I noticed. First off, the ResetForm command was missing from the submit action, meaning I could not submit another animal without restarting the app. That was an easy fix.
Next, of course, the model needs more training, especially for pandas. More images need to be added to the training set. Most of the training images were close-ups of faces and didn’t show the full panda, which may have caused the confusion when taking photos of them from further away.
It would also be really cool if we could take the pictures captured in the app and use them to continue training the model. Right now, we are saving the image and its label to Dataverse, but nothing is being done with that information. This would be a larger project and would require the use of Azure Cognitive Services. With Cognitive Services, we can use code to add and tag new images and retrain the model, so our results could be added on a schedule.
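As a rough illustration, here is a minimal sketch of what that retraining step could look like using the Custom Vision service (part of Cognitive Services) and its Python SDK. This assumes the model lives in a Custom Vision project we can add tagged images to, and the endpoint, key, project ID, and file names below are all placeholders:

```python
# A minimal sketch (not production code): upload a labeled photo to a
# Custom Vision project and kick off a new training iteration.
from msrest.authentication import ApiKeyCredentials
from azure.cognitiveservices.vision.customvision.training import CustomVisionTrainingClient
from azure.cognitiveservices.vision.customvision.training.models import (
    ImageFileCreateBatch,
    ImageFileCreateEntry,
)

# Placeholder values -- swap in your own Cognitive Services resource details.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
TRAINING_KEY = "<training-key>"
PROJECT_ID = "<project-guid>"

credentials = ApiKeyCredentials(in_headers={"Training-key": TRAINING_KEY})
trainer = CustomVisionTrainingClient(ENDPOINT, credentials)


def get_or_create_tag(label):
    """Reuse an existing tag (e.g. 'panda') or create it if it's new."""
    for tag in trainer.get_tags(PROJECT_ID):
        if tag.name == label:
            return tag
    return trainer.create_tag(PROJECT_ID, label)


def upload_labeled_image(image_path, label):
    """Add one photo from the zoo visit, tagged with the label saved alongside it."""
    tag = get_or_create_tag(label)
    with open(image_path, "rb") as image_file:
        batch = ImageFileCreateBatch(
            images=[
                ImageFileCreateEntry(
                    name=image_path,
                    contents=image_file.read(),
                    tag_ids=[tag.id],
                )
            ]
        )
    result = trainer.create_images_from_files(PROJECT_ID, batch)
    if not result.is_batch_successful:
        raise RuntimeError(f"Upload failed for {image_path}")


# Upload the new photo, then start a new training iteration.
upload_labeled_image("panda_far_away.jpg", "panda")
iteration = trainer.train_project(PROJECT_ID)
print("Training started, status:", iteration.status)
```

Something like a scheduled Power Automate flow or Azure Function could read the new image rows out of Dataverse and run this kind of code, so the model keeps improving as people use the app.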
Real World Applications
Now, I know that identifying animals is a bit of a silly application, but the goal was to have fun! Let’s talk about how this could be used in a real-life application. Several ideas are given on the Lobe Overview page, such as identifying manufacturing defects, spotting empty shelves, or classifying user-submitted images.
When thinking about animal applications, we can also think about conservation. Cameras are often used to watch areas that endangered animals may be visiting, and AI can be used to analyze the footage and help us understand what was captured.
What other applications can you think of?
Thanks for joining me on this series! Let me know your questions, comments or other content you would like to see!
Check out the rest of the posts in the AI at the Zoo series, or head over to Events where you can find me presenting on AI and other Power Platform topics!