Monitoring the feed consumption of a dairy cow can provide insight into both the animal's health and how much milk it can produce. To date, a number of automated systems have been developed to perform this task. In the Roughage Intake Control (RIC) system from Insentec (Marknesse, The Netherlands; www.insentec.eu), for example, an RFID tag identifies animals as they approach the feeding gate, granting or denying them access to a feeding trough that records feed intake after every visit.
"While useful," says researcher Anthony Shelley of the University of Kentucky (Lexington, KY; www.uky.edu), "the physical barrier placed between the cow and feed interferes with the animal's natural feeding behavior. An ideal system would measure, control and monitor individual feed intake while not interfering with feeding habits or inhibiting workflow on a farm."
In a proof-of-concept system designed to accomplish this, Shelley and his colleagues used a Carmine 1.08 RGB+depth sensor from PrimeSense - now part of Apple (Cupertino, CA, USA; www.apple.com) - to scan and record feed volume, from which feed weight can be derived. By taking 3D measurements of feed within a bin, the weight of feed consumed by cattle can be accurately determined. This can then provide insight into both the animal's health and how much milk the animal might produce.
"By recording 3D scans of a feed bin at various fill levels, computing the volume of feed present and measuring the resulting weight of the feed, a mapping from volume to weight can be derived," says Shelley.
In one experimental setup, the depth sensor was placed 100 cm from the top of a feeding bin with a single image captured from directly above the center of the feed surface. In operation, a pseudorandom dot pattern from the near-infrared laser illuminator in the depth sensor is projected onto the surface of the feeding bin. The reflected image is then captured by the sensor's near-infrared camera to detect constellations of points within the image. The position of these constellations within the camera's field of view is then used to determine how far the target surface is from the sensor.
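The principle behind this structured-light measurement is triangulation: the apparent shift (disparity) of a projected dot between its expected and observed image positions is inversely proportional to the distance of the surface. The sketch below illustrates the idea; the focal-length and baseline values are hypothetical placeholders, not PrimeSense specifications.

```python
# Illustrative structured-light depth recovery by triangulation.
# A dot projected from an illuminator offset from the camera shifts
# across the image as the target surface moves closer or farther away.
def depth_from_disparity(disparity_px, focal_px=580.0, baseline_m=0.075):
    """Return distance (m) to the surface for a given dot disparity (pixels).

    focal_px and baseline_m are assumed example values for a generic
    depth sensor, not the Carmine 1.08's actual calibration.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# With these example parameters, a dot shifted by 43.5 pixels
# corresponds to a surface about 1 m from the sensor:
d = depth_from_disparity(43.5)
```

Because disparity falls off with distance, depth resolution degrades for far surfaces, which is one reason the sensor was mounted only 100 cm above the bin.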
By changing the amount of feed in the bin and taking multiple 3D measurements, volume measurements of the feed within the bin can be made. These data can then be compared with the weight of the feed obtained by traditional weighing methods. As each image is captured, the data is transferred over the depth sensor's USB interface to a host PC. To produce a 3D map of the surface of the feed in the bin, KScan3D software from LMI Technologies (Delta, BC, Canada; www.lmi3d.com) was used. After each scan, the KScan3D software was used to reduce the 3D point cloud to approximately 80,000 data points, about 10% of the original data, as this was sufficient to accurately represent a 3D scan.
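Once the feed surface has been mapped, volume can be estimated by comparing each depth reading against the depth of the empty bin floor and summing the per-pixel fill heights. The sketch below shows this idea on a gridded depth image; all dimensions and the uniform fill level are illustrative, not values from the study.

```python
import numpy as np

def feed_volume_m3(depth_map, empty_depth_m, pixel_area_m2):
    """Estimate feed volume from an overhead depth image.

    depth_map      -- 2D array of sensor-to-surface distances (m)
    empty_depth_m  -- sensor-to-floor distance for the empty bin (m)
    pixel_area_m2  -- ground footprint of one pixel (m^2), assumed constant
    """
    # Fill height per pixel; negative values (noise below the floor) clipped.
    heights = np.clip(empty_depth_m - depth_map, 0.0, None)
    return float(heights.sum() * pixel_area_m2)

# Example: a uniform 10 cm layer of feed seen by a 100x100-pixel scan,
# with each pixel covering 1 cm^2 of bin floor:
depth = np.full((100, 100), 0.90)  # sensor is 0.90 m from the feed surface
vol = feed_volume_m3(depth, empty_depth_m=1.00, pixel_area_m2=1e-4)
# 10,000 pixels * 0.10 m * 1e-4 m^2 = 0.1 m^3
```

A real implementation would work from the registered point cloud rather than a raw depth image, but the height-integration step is the same.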
Regression analysis was then used to determine the correlation between the camera sensor depth measurements of the feed surface and the scale-measured weight values of the feed in the bin. The results showed a strong fit between the volumetric scan data and scale-measured weight values. According to Shelley, future studies will determine how the system functions in open feeding parlors with multiple cows entering and exiting throughout the day and how the animals' feeding behaviors affect system accuracy.
Shelley is also planning to extend the concept of 3D scanning in the development of a system for the automated inspection of living animals in an unmodified working environment. Using a Kinect V2 sensor from Microsoft (Redmond, WA, USA; www.microsoft.com), the system will automatically identify the tail of the animal as it walks under the camera in a hallway outside a milking parlor, as well as track the plane of symmetry that divides the animal's body into left and right halves. With such developments, RFID systems may eventually no longer be required.