OVERVIEW OF PROGRAM

​

MATERIALS REQUIRED

 

  1. Phone with camera

  2. FLIR infrared camera

  3. MATLAB software

​

INPUTS

​

  1. Pick a room in which you would like to grow a plant

    • The room must have a window that receives at least some natural lighting

  2. Capture a time lapse of the room by photographing it from two different perspectives

    • One image position should be perpendicular to the window (e.g. at the whiteboard in the image below)

    • The other image position should be at a 45 degree angle from the first, without capturing any window views (e.g. in the bottom left corner of the room between the projector and window in the image below)

    • At each position, use the phone to capture the standard color image and use the FLIR infrared camera to capture the thermal data

    • Capture the images within 5-20 ft of the window/natural light source and from 8 ft above the floor

    • Capture images in the same locations every hour from 10:00am to 5:00pm with all artificial light sources turned off

    • Take note of the minimum and maximum temperature range provided by the FLIR to ensure it has been calibrated correctly (e.g. a 40-50°F range in a temperature-controlled building would probably indicate the device's calibration button needs to be pressed)

[Image: sample room showing the two capture positions described above]

OUTPUTS

​

One pair of the original color images will be output from the MATLAB program with three rectangular boxes superimposed on the image. Each box indicates the optimal region in the room to place a specific plant. Two examples of our program's output are provided in a later section.
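As a sketch of how such a box can be superimposed on an image, here is a minimal Python illustration (our actual program is written in MATLAB; the array sizes, coordinates, and `draw_box` helper below are hypothetical):

```python
import numpy as np

def draw_box(img, top, left, height, width, value=255):
    """Superimpose a rectangular outline on a grayscale image array."""
    img = img.copy()
    img[top, left:left + width] = value               # top edge
    img[top + height - 1, left:left + width] = value  # bottom edge
    img[top:top + height, left] = value               # left edge
    img[top:top + height, left + width - 1] = value   # right edge
    return img

room = np.zeros((100, 160), dtype=np.uint8)  # stand-in for a room photo
marked = draw_box(room, top=40, left=100, height=20, width=30)
```

The original image is left untouched; only the returned copy carries the recommended-placement outline.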

​

PLANT SPECIES CONSIDERED

​

We narrowed the scope of our project to consider only three plant species: succulents, parlor plum, and yellow mum. Each of these plants is unique in the amount of sunlight, sunlight intensity, and temperature range needed to survive:

  • Succulent: low lighting (2-3 hrs), 40-90°F, morning light intensity

  • Parlor Plum: medium lighting (3-6 hrs), 50-70°F, adaptable light intensity

  • Yellow Mum: high-intensity lighting (>6 hrs), 60-75°F, adaptable light intensity

DATA COLLECTION

​

GROUND TRUTH

​

Some of the sampled images captured can be found in the Data tab; they include color and infrared data of several different locations. Collection 1 establishes the ground truth of an outdoor setting. Collections 2, 3, and 4 establish the ground truth of various building rooms.

​

PROJECT LIMITATIONS AND INVALID CASES 

​

Our project could be extended to account for various environments. The array of photos under the Data tab is meant to represent the different situations a person would encounter when trying to place a plant: outside, a classroom (simulating a bedroom or conference room), a confined room (simulating an office space), and a living room (simulating rooms in a house). It is important to note that there are several scenarios in which the environmental conditions can be misinterpreted and, as a result, our program would provide an ineffective placement of a plant. Collection 5, under the Data tab, exemplifies the extreme scenario where the images are taken at night with artificial lighting.

​

Also, we recognize that our algorithm does not hold true under several circumstances:

  1. The input photos must be taken while there is natural sunlight:

    • If the input data is captured with no sunlight, only artificial or fluorescent lighting, our program will not output an accurate placement of a plant for healthy development.

  2. The user chooses to input photos taken on a cloudy day with limited sunlight during a normally very sunny season:

    • This data collection would be misleading. If the room regularly receives around X hours of sunlight a day, one day of collected data would show significantly fewer hours of sunlight.

    • This example could apply to all sudden and inconsistent weather changes.

  3. Glare and reflections of lighting:

    • Highly reflective surfaces, such as a whiteboard, may be misinterpreted by the program as a light source. However, this reflected light does provide some valuable information: the given spatial region must be receiving natural lighting, and thus it should be taken into consideration.

​

ANALYSIS TECHNIQUES

​

A detailed step-by-step explanation of the image processing tools utilized in our project can be found under the Data Analysis tab. The developed code can be found under the Matlab code tab. Listed below are the in-class and out-of-class image signal processing tools our group investigated and/or used.

​

IN-CLASS DSP TOOLS

​

1. Linear Systems

The moving average filter is a system that averages the last M input points:

    y[n] = T{x[n]} = (1/M)(x[n] + x[n-1] + ... + x[n-M+1])

where x[n] is the input, T is the filter, and y[n] is the output of the system. With this expression, we can show that the moving average filter is a linear system by setting the input to the superposition of two signals x1[n] and x2[n]:

    T{x1[n] + x2[n]} = (1/M)((x1[n] + x2[n]) + (x1[n-1] + x2[n-1]) + ... + (x1[n-M+1] + x2[n-M+1]))
                     = (1/M)(x1[n] + x1[n-1] + ... + x1[n-M+1]) + (1/M)(x2[n] + x2[n-1] + ... + x2[n-M+1])
                     = T{x1[n]} + T{x2[n]}

Since the left and right side expressions are equal, this filter is a linear system. On the other hand, the threshold filter applied to the images is a non-linear system. The non-linear behavior of this filter can be recognized with the following example: if individual pixels X and Y are each slightly below the threshold, then the pixel obtained from adding X and Y would be above the threshold.
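Both properties can be checked numerically. The following Python sketch (our project code is in MATLAB; the `moving_average` and `threshold` helpers and the sample signals are hypothetical) verifies the linearity of the moving average and the non-linearity of thresholding:

```python
import numpy as np

def moving_average(x, M=3):
    """y[n] = (1/M) * sum of the last M input samples (FIR, linear)."""
    h = np.ones(M) / M
    return np.convolve(x, h)

def threshold(x, t=1.0):
    """Binary threshold (non-linear)."""
    return (x >= t).astype(float)

x1 = np.array([0.2, 0.6, 0.4])
x2 = np.array([0.5, 0.3, 0.7])

# Linearity: T{x1 + x2} equals T{x1} + T{x2}
lhs = moving_average(x1 + x2)
rhs = moving_average(x1) + moving_average(x2)
print(np.allclose(lhs, rhs))  # True

# Non-linearity of thresholding: every sample of x1 and x2 is below the
# threshold, yet one sample of x1 + x2 (= 1.1) exceeds it
print(threshold(x1) + threshold(x2))  # all zeros
print(threshold(x1 + x2))             # one non-zero entry
```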

​

2. FIR (Finite Impulse Response) and IIR (Infinite Impulse Response)

​

An FIR filter is a system whose impulse response is of finite duration. This means that the output response to an impulse input settles to zero outside some finite range. Conversely, an IIR filter has feedback, which results in an impulse response with infinitely many non-zero values. The moving average filter utilized in our project has a 2D rectangular pulse as its FIR impulse response:

    h[m, n] = 1/(M·N) for 0 ≤ m ≤ M-1 and 0 ≤ n ≤ N-1, and h[m, n] = 0 otherwise
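A 2D moving average with this rectangular-pulse impulse response can be sketched in Python as a direct convolution (an illustration only; our implementation used MATLAB, and the `box_filter_2d` helper is hypothetical):

```python
import numpy as np

def box_filter_2d(img, M=3, N=3):
    """2D moving average: convolve with an MxN rectangular pulse h[m,n] = 1/(M*N)."""
    h = np.full((M, N), 1.0 / (M * N))
    out = np.zeros_like(img, dtype=float)
    P, Q = img.shape
    # Causal zero-padding so each output uses the current and previous samples
    padded = np.pad(img.astype(float), ((M - 1, 0), (N - 1, 0)))
    for i in range(P):
        for j in range(Q):
            out[i, j] = np.sum(padded[i:i + M, j:j + N] * h)
    return out

img = np.ones((4, 4))
result = box_filter_2d(img)
print(result[2, 2])  # interior pixel: average of nine ones, about 1.0
```

Because the impulse response is a finite MxN block, each output pixel depends on only a finite neighborhood of inputs — the defining property of an FIR filter.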

3. Change of Basis: RGB (Red Green Blue) and HSV (Hue Saturation Value)

​

Images can be converted to new bases for analysis, manipulation, and extraction of data. Two bases our group used in the project were RGB and HSV (shown by the images below) in order to extract ideal light and temperature regions from the sampled images. For more information, please see the explanations under the Data Analysis tab.

[Figure: RGB and HSV color space representations]

Image source: https://www.quora.com/What-are-the-differences-between-RGB-HSV-and-CIE-Lab
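As a small illustration of this change of basis, Python's standard `colorsys` module implements the same RGB-to-HSV conversion (MATLAB's `rgb2hsv` serves this role in our code); the sample pixel values below are arbitrary:

```python
import colorsys

# Pure red: hue 0, full saturation, full value (all channels in [0, 1])
h, s, v = colorsys.rgb_to_hsv(1.0, 0.0, 0.0)
print(h, s, v)  # 0.0 1.0 1.0

# A dim pixel of the same hue has a low V (value); thresholding on the V
# channel isolates the brightest regions of an image regardless of hue
_, _, v_dim = colorsys.rgb_to_hsv(0.2, 0.1, 0.0)

# The change of basis is invertible: round-trip back to RGB
r, g, b = colorsys.hsv_to_rgb(h, s, v)
print((r, g, b))  # (1.0, 0.0, 0.0)
```

Separating brightness (V) from color (H, S) is what makes HSV convenient for picking out well-lit regions in the color images and warm regions in the false-color infrared images.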

​

​

OUT-OF-CLASS DSP TOOLS

 

As part of our Digital Signal Processing course, we were encouraged to explore analysis techniques that were not covered throughout the semester. To identify the spatial alignment between two image faces of a room, we considered SIFT, ASIFT, and HOG.

 

1. SIFT and ASIFT

The Scale Invariant Feature Transform (SIFT) algorithm is a computer vision and image processing technique that detects and matches local features between two images. The Affine-SIFT (ASIFT) algorithm goes a step further than SIFT by accounting for varying camera axis orientations between the two captured images. Thus, for the interest of our project, the Affine-SIFT algorithm is more applicable.

​

2. HOG

The HOG (Histogram of Oriented Gradients) descriptor is another technique used for object detection in an image. This technique divides an image into small square regions (cells), forms a histogram of gradient directions for each of those cell regions, and normalizes the histograms over larger blocks of cells. An example of the HOG applied to an image is shown below.

[Figure: HOG descriptor applied to a sample image]
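The per-cell histogram step of HOG can be sketched with NumPy (a simplified illustration without the block normalization; the `cell_hog` helper is hypothetical):

```python
import numpy as np

def cell_hog(cell, n_bins=9):
    """Histogram of gradient orientations for one cell, weighted by magnitude."""
    gy, gx = np.gradient(cell.astype(float))            # vertical, horizontal gradients
    mag = np.hypot(gx, gy)                              # gradient magnitude per pixel
    ang = np.mod(np.degrees(np.arctan2(gy, gx)), 180.0) # unsigned orientation, 0-180
    bins = np.minimum((ang / (180.0 / n_bins)).astype(int), n_bins - 1)
    hist = np.zeros(n_bins)
    for b, m in zip(bins.ravel(), mag.ravel()):
        hist[b] += m                                    # vote weighted by magnitude
    return hist

# A vertical edge: gradients point horizontally, so the 0-degree bin dominates
cell = np.tile(np.array([0, 0, 1, 1], dtype=float), (4, 1))
hist = cell_hog(cell)
print(np.argmax(hist))  # 0
```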

Additionally, we learned that the HOG feature could be used to detect corners of objects in a room. We investigated this feature further to determine whether it could be used to estimate the range of depth of a given room. An example is shown below where the HOG feature is applied to two images captured at a 45 degree difference. In reviewing the green key-points, we realized they were not very consistent across the two images. Our group found this object detection method interesting; however, we did not find it useful for our application.

[Figure: HOG key-points on two images captured at a 45 degree difference]

PROGRAM OUTPUTS

​

For the three different room environments we collected data on, we ran our program to obtain an output. Two demonstrations of our program are shown below.

​

1. Single Image

This room was the initial run-through of our program, where we used only one time frame of data. The two images were taken at a 90 degree angle from each other. Visually, we can see that the overlapping yellow rectangular boxes on the two images indicate a similar spatial region where a plant should be placed. From observation of the sampled data, this output seems reasonable. It is important to note that the plant considered for this output was one that required the brightest light and warmest temperature in the room.

[Figure: single-image program output with yellow rectangular boxes]

2. Time Lapse of South Facing Room

​

The output images below display blue rectangular boxes indicating where a medium-sized yellow mum, parlor plum, and succulent should be placed. This output seems valid since the yellow mum requires more sunlight than the parlor plum and succulent, so its box should be closest to the window. Additionally, the succulent's suggested placement is the furthest to the left of the room, which is reasonable since this plant type requires the least amount and intensity of sunlight. The parlor plum is the median between the mum and succulent in regard to sunlight and temperature requirements; thus, it is reasonable that its suggested placement is somewhere in between the two.

[Figure: time-lapse program output with blue rectangular boxes]

VALIDATION

​

In order to verify that the infrared images output accurate temperature readings, we placed temperature sensors around the sampled room as displayed in the room layout figure below. This room is the "Time Lapse of South Facing Room" demo discussed in the previous section. The placement of these sensors is indicated with numbers from 1 to 7, covering colder areas by the window, corners of the room, optimal plant placements based on our hypothesis, and the location our program had recommended.

[Figure: room layout with temperature sensor positions 1-7]

The temperature readings collected from 1:00pm to 5:00pm are shown in the image below. It is important to note that there are heaters surrounding the windows; thus, position 4's temperatures were higher than position 3's. The rest of the sensor temperature data (1, 2, 4-7) stays within 71°F to 73°F. This data validates that the temperatures do not vary significantly in the sampled room and thus would not significantly affect the recommended plant position.

[Figure: temperature sensor readings from 1:00pm to 5:00pm]

TEAM MEMBERS' CONTRIBUTIONS

​

The main project contributions of each team member are listed below:

​

Adrianna Pierce

  • Moving average filtering to produce pixelated image

  • Plant placement indicator with rectangular box imposed on original image

​

Maria Karstens

  • Colored and infrared image processing with RGB and HSV change of basis: normalization, grayscale, threshold filter for identification of ideal light and temperature regions

  • ASIFT (Affine Scale Invariant Feature Transform) to identify matching areas of a room

​

Shivani Shah

  • Time lapse data collection and overlapping of colored and infrared images

  • Matrix matching of colored and infrared images by resolution reduction

​

FUTURE IMPROVEMENTS

 

If we had not been limited by the time constraints of this class project, there are a few improvements we would have made. First, we would have liked to increase the diversity of plant species that the user can choose from: instead of being limited to three plants, the user would have a choice of hundreds. With that, we would have collected daily time lapses over the course of multiple months and combined our findings with known weather patterns, tailoring the choice of plants to the season.

 

In regard to the output image, we currently place a fixed-size box onto the input color image. We realize that not all users will take photos with the same scaled proportions. To solve this issue, we would add an analysis feature that scales the size of the box according to the size of existing objects in the room, or provide an option for the user to input the dimensions of the sampled room.

 

Lastly, we would like to add user interaction, preferably in the form of a mobile application. This would allow the user to simply attach a temperature sensor, open the app, take a few pictures of a room where they would like to add a plant, choose a plant, and wait for the output.

​

References:

Analysis expressions taken from Professor Laura Balzano's University of Michigan EECS 351 lecture notes.

HSV and RGB: https://www.mathworks.com/help/images/convert-from-hsv-to-rgb-color-space.html

HOG feature: http://scikit-image.org/docs/dev/auto_examples/plot_hog.html

ASIFT feature: http://www.ipol.im/pub/art/2011/my-asift/

​

