Stage 1

Problem

Maintaining the health of household plants is difficult for many hobbyist plant owners. Household plants span many species and sub-species, and care instructions and common ailments vary accordingly. There are online encyclopaedia-style websites with plant care instructions, apps that identify a plant's species from an image, and, more recently, an app that diagnoses what ails a plant from a photo. There are also a few apps that help track a watering schedule, although the schedule has to be set manually, and apps that estimate how much sunlight a plant receives based on its location in the house relative to the sun's current position in the sky. With such a wide array of apps available, the average plant hobbyist can end up juggling several of them just to track the health of their plants. Moreover, each app bundles many features, so the user has to learn its UI layout even if they only need one of them. This is why we need a single app that manages all aspects of plant health in a user-friendly way.

Solution

Our proposed solution to the above problem is a mobile app featuring an AR status screen for each plant the phone camera sees. The status screen would give a colour-coded representation of the plant's overall health, along with its nickname and species. Tapping the status screen would expand it to show more detailed information about each aspect of plant health, such as soil water saturation, daily sunlight, soil nutrient (fertilizer) levels, and any diseases.

The app would compute the health status by extracting an image of the plant from the camera view and running a deep learning model that compares it with existing images of the same species at varying health levels. Sunlight would be tracked by comparing the phone's location with the computed position of the sun at the current time of day. Water and fertilizer levels would be tracked from user input, since soil is not always visible in a plant pot; where the soil is visible, water levels could be detected automatically from the camera view, or the user could be asked to pick, from a short list of soil images, the one that best matches their own. When it is time to water or fertilize a plant, the app would send the user a reminder, subject to the user's notification preferences.

To cover the educational side of plant care, the app would also include a library of articles on the basics of plant care, organized into hierarchies of topics and fully searchable; the plant status screen could link directly to a relevant article when it detects a particular issue. The app would be distributed on the Google Play Store and the Apple App Store and would require permission to use the camera.
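To make the health-status step concrete, the sketch below shows how a single photo could be classified into coarse health levels. It is a minimal Python prototype, not the app's actual implementation: the ResNet-18 backbone, the checkpoint file name, and the class labels are illustrative assumptions, and the production app would run an equivalent model on-device or server-side.

```python
# Prototype sketch: classify one plant photo into coarse health levels.
# Assumes a hypothetical fine-tuned checkpoint "plant_health.pt"; the class
# labels below are placeholders, not the app's real taxonomy.
import torch
from torchvision import models, transforms
from PIL import Image

HEALTH_CLASSES = ["healthy", "underwatered", "overwatered",
                  "nutrient_deficient", "diseased"]

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def load_model(checkpoint_path: str) -> torch.nn.Module:
    """Load a ResNet-18 whose final layer was retrained on plant-health labels."""
    model = models.resnet18(weights=None)
    model.fc = torch.nn.Linear(model.fc.in_features, len(HEALTH_CLASSES))
    model.load_state_dict(torch.load(checkpoint_path, map_location="cpu"))
    model.eval()
    return model

def classify_plant(model: torch.nn.Module, image_path: str) -> tuple[str, float]:
    """Return the most likely health class and its confidence for one photo."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # shape: (1, 3, 224, 224)
    with torch.no_grad():
        probs = torch.softmax(model(batch), dim=1)[0]
    best = int(torch.argmax(probs))
    return HEALTH_CLASSES[best], float(probs[best])
```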
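The sunlight estimate could be driven by a standard solar-position approximation. The sketch below is a simplified version that ignores the equation of time and atmospheric refraction; the coordinates are assumed to come from the phone's location services, and window orientation or obstructions would still need user input.

```python
# Prototype sketch: estimate the sun's elevation for the phone's location.
# Simplified approximation; good enough for "is the sun up and roughly how
# high", not for precise solar tracking.
from datetime import datetime, timezone
import math

def solar_elevation_deg(lat_deg: float, lon_deg: float, when_utc: datetime) -> float:
    """Approximate solar elevation angle in degrees above the horizon."""
    day_of_year = when_utc.timetuple().tm_yday
    # Approximate solar declination for this day of the year.
    declination = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # Approximate local solar time from UTC and longitude (15 degrees per hour).
    solar_hours = (when_utc.hour + when_utc.minute / 60.0 + lon_deg / 15.0) % 24.0
    hour_angle = 15.0 * (solar_hours - 12.0)

    lat, dec, ha = map(math.radians, (lat_deg, declination, hour_angle))
    elevation = math.asin(math.sin(lat) * math.sin(dec)
                          + math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(elevation)

def sun_is_up(lat_deg: float, lon_deg: float) -> bool:
    """Rough check used when accumulating a plant's daily sunlight hours."""
    return solar_elevation_deg(lat_deg, lon_deg, datetime.now(timezone.utc)) > 0.0
```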
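Reminder scheduling can stay simple: each plant stores when it was last watered and fertilized plus the desired intervals. The sketch below is illustrative only; the interval defaults are placeholder assumptions, and the real app would hand the due tasks to the platform's notification system according to the user's preferences.

```python
# Prototype sketch: per-plant care schedule that drives reminders.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class CareSchedule:
    nickname: str
    last_watered: date
    last_fertilized: date
    water_every: timedelta = timedelta(days=7)      # placeholder default
    fertilize_every: timedelta = timedelta(days=30)  # placeholder default

    def due_tasks(self, today: date | None = None) -> list[str]:
        """Return the care tasks that are due on or before today."""
        today = today or date.today()
        due = []
        if today >= self.last_watered + self.water_every:
            due.append(f"Water {self.nickname}")
        if today >= self.last_fertilized + self.fertilize_every:
            due.append(f"Fertilize {self.nickname}")
        return due

# Example: a plant watered 8 days ago is due for watering today.
fern = CareSchedule(
    nickname="Office Fern",
    last_watered=date.today() - timedelta(days=8),
    last_fertilized=date.today() - timedelta(days=10),
)
print(fern.due_tasks())  # ['Water Office Fern']
```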

Stage One Deliverables

All of our deliverables can be found on our GitHub.
  • Team Contract
  • Online Repository
  • Online Portfolio
  • Three Project Ideas