
Image processing techniques were implemented in Matlab to analyze and interpret the data collected. The steps taken to determine the optimal plant placement are described below.

1. Cropping and Resolution Matching of Color and Infrared Images

Because the iPhone camera and the Flir device have different optics, the color and infrared images differ in aspect ratio. All of the color images were cropped to the viewing range of the infrared images to ensure an accurate plant-placement recommendation. The color images also have a higher resolution than the infrared images (1000x750 vs. 640x480 pixels), so we downsampled them with the Matlab function imresize() to allow a pixel-by-pixel comparison of the two images.
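The crop-and-resize step can be sketched as follows. This is a minimal Python/NumPy stand-in for the Matlab workflow (the project used imresize()), using nearest-neighbor sampling rather than imresize()'s default bicubic interpolation:

```python
import numpy as np

def resize_nearest(img, new_h, new_w):
    """Nearest-neighbor resize, a simplified stand-in for Matlab's imresize()."""
    h, w = img.shape[:2]
    rows = np.arange(new_h) * h // new_h   # source row for each output row
    cols = np.arange(new_w) * w // new_w   # source column for each output column
    return img[rows][:, cols]

# Downsample a 750x1000 color image to the 480x640 infrared pixel grid
# (after cropping it to the infrared viewing range).
color = np.zeros((750, 1000, 3))
matched = resize_nearest(color, 480, 640)
```

After this step the color and infrared matrices share the same dimensions, so corresponding pixels can be compared and multiplied directly.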

2. Image Processing

RGB and HSV

The color images were used to determine regions of optimal sunlight, and the infrared images were analyzed by mapping their color range to a range of temperatures.

The regions of a room with adequate sunlight were identified by converting each color image to grayscale and then applying a threshold filter. Each plant has a unique numerical threshold representing the sunlight intensity and duration it requires to live. For example, the yellow mum requires more than 8 hours of direct sunlight, while the succulent only requires 2 hours of partial sunlight. In the top-left image of the Yellow Mum Data Analysis and Succulent Data Analysis collections below, significantly more regions are filtered out for the yellow mum than for the succulent. Filtered-out pixels are black, and pixels that pass the filter keep their original grayscale value.
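A minimal sketch of the grayscale-and-threshold step, written in Python/NumPy rather than Matlab: the luminance weights match those used by rgb2gray(), but the threshold value here is illustrative, not one of the real per-plant constants.

```python
import numpy as np

def sunlight_mask(rgb, threshold):
    """Grayscale conversion followed by a per-plant sunlight threshold.

    Pixels below the threshold are zeroed (black); pixels at or above it
    keep their grayscale value, as in the filtered images described above.
    """
    # Luminance weights used by Matlab's rgb2gray().
    gray = 0.2989 * rgb[..., 0] + 0.5870 * rgb[..., 1] + 0.1140 * rgb[..., 2]
    return np.where(gray >= threshold, gray, 0.0)

# A bright (sunlit) region passes a high, mum-like threshold; a dim one does not.
img = np.zeros((4, 4, 3))
img[:2, :, :] = 0.9          # sunlit upper half
mask = sunlight_mask(img, 0.8)
```

A plant needing more sun simply gets a higher threshold, so fewer pixels survive the filter.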

 

The Flir infrared images were then filtered to identify the regions of a room with adequate temperatures. Each infrared image was converted to the HSV basis to intensify its saturation so that the warmer regions became more concentrated and defined. The infrared color spectrum displayed by the Flir varies with the relative temperatures in a photo, so we used the temperatures reported by the Flir to calibrate our filters. Since the succulent can withstand a wider range of living temperatures, its filtered Flir image is entirely white, indicating that all temperatures in the room are acceptable. In contrast, the yellow mum requires slightly warmer temperatures (60-75 F), so its acceptable regions are limited. The two filtered images were then multiplied together: for every x- and y-coordinate on both images, the corresponding pixels are multiplied. Keeping only the regions where ideal sunlight and ideal temperature overlap filters out any invalid detection of light or heat sources. The processed images for the mum and succulent plant types are shown in the middle-right section.
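The element-wise multiplication of the sunlight and temperature masks can be sketched as follows (Python/NumPy; the mask values are invented for illustration):

```python
import numpy as np

# Illustrative grayscale sunlight mask and binary temperature mask
# (1 = acceptable temperature); the values are made up.
sun = np.array([[0.9, 0.0],
                [0.7, 0.5]])
temp = np.array([[1.0, 1.0],
                 [0.0, 1.0]])

# Element-wise product: a pixel survives only if it is acceptable in BOTH
# masks, which discards stray light or heat sources detected by one filter.
combined = sun * temp
```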

Moving Average Filter

To differentiate more effectively between the light and shaded areas in the images and provide an accurate plant-placement recommendation, we applied a moving average filter to the color images. We implemented this filter in Matlab with a function that steps through the pixels along the x- and y-axes and breaks the image into rectangles the size of a medium-sized plant. The pixels in each rectangle were averaged so that the rectangle displays a solid grayscale color indicating the average amount of sunlight in that area (a pixel value of 0 is black and a value of 1 is white). The average color of each rectangle was then saved into a new image matrix. The processed images for the yellow mum and succulent are shown in the lower-left section of the image collections below.
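The block-averaging idea can be sketched as follows (a Python/NumPy stand-in for our Matlab function; the 2x2 block here stands in for a "medium-sized plant" rectangle):

```python
import numpy as np

def block_average(gray, bh, bw):
    """Replace each bh x bw rectangle with its mean grayscale value."""
    out = gray.astype(float).copy()
    h, w = gray.shape
    for y in range(0, h - bh + 1, bh):
        for x in range(0, w - bw + 1, bw):
            out[y:y + bh, x:x + bw] = gray[y:y + bh, x:x + bw].mean()
    return out

# A 4x4 image averaged in 2x2 "plant-sized" blocks.
img = np.array([[1.0, 0.0, 1.0, 1.0],
                [0.0, 1.0, 1.0, 1.0],
                [0.0, 0.0, 0.0, 0.0],
                [0.0, 0.0, 1.0, 0.0]])
averaged = block_average(img, 2, 2)
```

Each block in the result is a flat gray level, so bright rectangles are easy to compare against one another.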

 

The rectangle with the largest average value was then highlighted in yellow to indicate where a plant should be placed. From there, we superimposed the yellow rectangle (given its x- and y-coordinates) onto the original image to produce our final output. To account for different room environments, we wrote a second moving-average function so the filter handles both horizontal and vertical images (480x640 and 640x480).
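Selecting the brightest rectangle can be sketched like this (Python/NumPy; best_block is a hypothetical helper, not our actual Matlab code, and highlighting in yellow would amount to setting that block's red and green channels to 1 in the RGB image):

```python
import numpy as np

def best_block(avg, bh, bw):
    """Return (row, col) of the top-left corner of the brightest rectangle
    in a block-averaged image, where every pixel of a block holds its mean."""
    h, w = avg.shape
    best, best_yx = -1.0, (0, 0)
    for y in range(0, h - bh + 1, bh):
        for x in range(0, w - bw + 1, bw):
            if avg[y, x] > best:          # one sample per block suffices
                best, best_yx = avg[y, x], (y, x)
    return best_yx

# Block-averaged image with four 2x2 blocks; the brightest starts at (0, 2).
avg = np.array([[0.5, 0.5, 1.0, 1.0],
                [0.5, 0.5, 1.0, 1.0],
                [0.0, 0.0, 0.25, 0.25],
                [0.0, 0.0, 0.25, 0.25]])
y, x = best_block(avg, 2, 2)
```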

3. Time Lapse 

The color and infrared images captured throughout the day were overlapped to determine the amount of sunlight received in each part of the room. As shown below, the darker shades of blue represent the regions that received the most sunlight, while the yellower shades represent fewer hours of direct sunlight. For our project, the program keeps only the sunlight durations that fit the requirements of the plant species considered (listed under the Proposal page).
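The overlap step amounts to summing per-capture sunlight masks; a minimal Python/NumPy sketch (the mask values and the two-capture requirement are illustrative, not the real per-plant figures):

```python
import numpy as np

# Three illustrative binary sunlight masks taken at different times of day
# (1 = lit). Summing them counts how many captures each pixel was lit in,
# a proxy for hours of direct sunlight.
masks = [
    np.array([[1, 1], [0, 0]]),
    np.array([[1, 0], [0, 0]]),
    np.array([[1, 1], [1, 0]]),
]
hours = np.sum(masks, axis=0)

# Keep only pixels meeting a plant's requirement (here: lit in >= 2 captures).
adequate = hours >= 2
```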

5. Depth Detection using ASIFT

Two images of a room were captured from different angles to identify depth. ASIFT (Affine Scale Invariant Feature Transform) is an image processing technique that detects matching objects between two images while accounting for varying camera-axis orientations, so we used the ASIFT algorithm to gain a better spatial understanding of the room. Applied to the images below, which were taken with a 45-degree difference in y-axis orientation, ASIFT accurately matches the detected objects in both images (white connecting lines).

We also applied the ASIFT feature to a variety of image pairs to see how its accuracy depends on the camera-axis orientations. The images below were also taken with a 45-degree difference, but far fewer keypoints could be identified.

To further investigate the ASIFT algorithm, we used two images captured perpendicular to each other, as shown below. Although some keypoints were identified, they are incorrect: for example, the right-side window in the right image is matched to the window in the left image, when the connection should in fact be with the left-side window in the right image. This example demonstrates a failure case of ASIFT object detection and shows that the difference in angle between the two captured images must be less than 90 degrees.

REFERENCES

https://www.mathworks.com/help/images/functionlist.html

https://www.mathworks.com/matlabcentral/newsreader/view_thread/309606

https://www.mathworks.com/help/matlab/ref/rgb2hsv.html

https://en.wikipedia.org/wiki/Scale-invariant_feature_transform

Yellow Mum Data Analysis

Succulent Data Analysis
