Digital agriculture may be an important part of the solution to challenges facing U.S. agriculture, including rising production costs, climate change, and labor shortages.
A new report issued today by USDA’s Economic Research Service, Precision Agriculture in the Digital Era: Recent Adoption on U.S. Farms, documents trends in the U.S. farm sector’s adoption of digital agriculture technologies between 1996 and 2019, with emphasis on changes since 2016, using data from USDA’s Agricultural Resource Management Survey.
Here are a few key findings from the report:
- A majority of row crop acreage is managed using auto-steer and guidance systems: Use of auto-steer and guidance systems grew from only 5.3% of planted corn acres in 2001 to 58% in 2016.
- Adoption rates vary by farm size: At least half of relatively large row crop farms (those at or above the third acreage quintile, meaning at least 60% of fields are on farms with lower acreage) rely on yield maps, soil maps, variable rate technologies, and/or guidance systems.
- Farmers use precision agriculture technologies for a variety of reasons: As technological capabilities have evolved, so have farmers’ rationales for using them. For example, corn and winter wheat farmers tend to rely on yield monitors to track crop moisture content. By contrast, yield monitors are primarily used to help determine chemical input use in cotton, soybean, and sorghum production.
For more detailed information, please refer to the full report.
USDA Economic Research Service