
  • What Producers Need to Know About Futures Markets

    Authors: Will Maples, Mississippi State University, and Wendiam Sawadgo, Auburn University

    Many articles published by Southern Ag Today reference the futures market. Given its importance, it is worth taking a step back to review what a futures market is and why it matters for agriculture. Futures markets are one of the most important tools available to row crop producers for managing price risk. At their core, futures markets allow buyers and sellers to agree today on a price for a commodity that will be delivered at a future date. This differs from the cash (or spot) market, where commodities are bought and sold for immediate delivery.

    While futures markets have existed in various forms throughout history, the modern agricultural futures market began in Chicago in the late 1800s. It developed to address a core challenge in agriculture: sharp seasonal price swings. At harvest, abundant supplies pushed prices lower, while tighter supplies later in the marketing year drove prices higher. This made planning difficult for both producers and buyers. Early forward contracts helped, but still carried risk, as they were customized and depended on both parties honoring the agreement. Standardized futures contracts, traded on exchanges such as the Chicago Board of Trade, created a more reliable system with greater certainty of performance.

    In simple terms, a futures contract is a standardized agreement to buy or sell a specific quantity of a commodity at a set future date. Each contract defines the delivery time, quantity, and quality of the commodity. For example, a December corn futures contract represents 5,000 bushels of #2 Yellow corn for delivery in mid-December. The only element not specified is price, which is determined through trading on the exchange.

    Because price is determined through trading, futures markets play a central role in price discovery. Prices in these markets reflect the collective expectations of buyers and sellers for future supply and demand conditions. New information, such as changes in weather, yield expectations, exports, or policy, is quickly incorporated into futures prices. As a result, futures markets provide a transparent and forward-looking estimate of commodity values. For producers, these prices serve as a key reference point when making marketing decisions and evaluating potential profitability.

    One important point is that trading a futures contract does not involve the exchange of the physical commodity. Instead, what is being traded is the obligation to deliver or receive the commodity at a future date. These obligations can be offset prior to delivery. For example, a producer who sells a futures contract is guaranteeing delivery at a future date. The producer can then offset that position by later buying that same futures contract. Because positions can be offset, most futures trades do not result in physical delivery. This structure also allows individuals without direct access to the commodity to participate in the market. These participants, known as speculators, play an important role by providing liquidity and taking on the price risk of hedgers.

    Hedgers are individuals who buy or sell the underlying commodity and use futures markets to manage price risk. Row-crop producers fall into this category, as they produce the commodities underlying these contracts. For them, the futures market is a risk management tool rather than a speculation tool.

    Consider a soybean producer in May who plans to sell at harvest. That producer faces the risk of prices falling before October. By selling a November soybean futures contract in May, the producer can establish a price level. At harvest, the producer sells soybeans in the cash market and buys back the futures contract. Gains or losses in the futures position offset changes in the cash price, helping stabilize revenue. While the details of hedging are beyond the scope of this article, many Extension resources across the Southern Region provide additional guidance for using futures markets to manage price risk.
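    The offsetting mechanics of this short hedge can be illustrated with simple arithmetic. The prices below are hypothetical, chosen only to show how a gain on the futures position offsets a decline in the cash price; a real hedge would also involve basis, brokerage costs, and margin.

```python
# Hypothetical short hedge for a soybean producer (all prices illustrative).

contract_bushels = 5_000          # one soybean futures contract

# May: producer sells November futures to establish a price level
futures_sold_at = 11.50           # $/bu, hypothetical

# October: prices have fallen; producer sells beans in the cash market
# and buys back the futures contract to offset the position
cash_price = 10.20                # $/bu, hypothetical local cash price
futures_bought_back_at = 10.40    # $/bu, hypothetical

futures_gain = (futures_sold_at - futures_bought_back_at) * contract_bushels
cash_revenue = cash_price * contract_bushels
net_revenue = cash_revenue + futures_gain

print(f"Futures gain:    ${futures_gain:,.2f}")   # positive because prices fell
print(f"Cash revenue:    ${cash_revenue:,.2f}")
print(f"Net revenue:     ${net_revenue:,.2f}")
print(f"Effective price: ${net_revenue / contract_bushels:.2f}/bu")
```

    Had prices risen instead, the futures position would show a loss, but the higher cash price would offset it, which is how the hedge stabilizes revenue in either direction.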

    For producers, the key is understanding how futures prices relate to local cash prices and how those signals can be used in a marketing plan. While no strategy guarantees the best price, using futures alongside tools such as forward contracts, crop insurance, and storage can help reduce downside risk and create more consistency in revenues. Taking time to understand how these markets work can put producers in a stronger position to make informed marketing decisions throughout the year.


    Maples, William E., and Wendiam Sawadgo. “What Producers Need to Know About Futures Markets.” Southern Ag Today 6(15.3). April 8, 2026.

  • Hogs and Pigs Report Signals Limited Supply Growth

    The latest USDA Quarterly Hogs and Pigs Report was released at the end of March and showed limited growth in the U.S. hog sector. Total inventory increased by 0.4 percent to just over 74.3 million head. This was within the range of pre-report expectations, but below the average pre-report estimates. Market hog inventories were also slightly higher, up 0.6 percent to 68.4 million head.

    The more notable shift occurred in the breeding herd, which declined 1.5 percent year-over-year to just under 5.9 million head. This was below pre-report expectations, and the decline suggests producers remain hesitant to expand aggressively. In addition, USDA revised the December 2025 inventory, which reinforced the signal of limited expansion. 

    Farrowings during the December 2025 to February 2026 quarter declined 1.5 percent, lower than expected and the opposite of average pre-report expectations for modest growth. Productivity gains helped offset reduced farrowings: pigs per litter increased 2.1 percent to 11.9, stronger than expected, pushing the pig crop up 0.6 percent to 33.2 million head.

    Looking ahead, farrowing intentions signal producers are being cautious with production changes. Producers plan a slight increase in farrowings for March to May (up 0.1 percent) but a decline for June to August (down 2.1 percent). Supplies are expected to remain adequate in the near term but tighten later in the year, which could provide some support to prices.


    Maples, Josh. “Hogs and Pigs Report Signals Limited Supply Growth.” Southern Ag Today 6(15.2). April 7, 2026.

  • Wage Aggregation Issues under the Old and New AEWR Methodologies

    Authors: Cesar L. Escalante, Joshua Emmanuel, and Naimul Bhuiyan

    On October 2, 2025, the U.S. Department of Labor (DOL) issued an Interim Final Rule (IFR) that introduced several modifications to the determination of adverse effect wage rates (AEWRs) applicable to H-2A workers hired by U.S. farmers. The new methodology departs from relying on wage data from the Farm Labor Survey (FLS) and instead sets AEWRs from wage information collected under the Occupational Employment and Wage Statistics (OEWS) survey of the Bureau of Labor Statistics.

    This article pursues the arguments presented in our previous SAT article, which showed discrepancies in domestic-H-2A wage differentials at the national level due to the more aggregated nature of the old AEWRs. The previous FLS-based AEWR-setting mechanism was riddled with aggregation issues pertaining to geography, commodity and job type, and static annual AEWR levels.

    The new AEWR-setting scheme that H-2A employers are using this year resolves some of these aggregation issues. Specifically, state-level AEWRs are now set using OEWS survey data, resolving issues associated with regional aggregation under the old system. In addition, there are now two wage tiers for H-2A workers, established according to workers’ skill levels:

    • Level 1 – Entry Level AEWRs apply to positions with neither formal education nor specialized training requirements; and
    • Level 2 – Experienced Level AEWRs are set at higher rates “commensurate” to the skills, education, training, and/or experience requirements for such positions.

    Finally, a downward compensation adjustment is also available for employers that provide housing for their H-2A workers. The adjustment rate is determined as the “equivalent hourly rate based on the weighted statewide average of Fair Market Rents for a four-bedroom housing unit available from the Department of Housing and Urban Development” (Congressional Research Service, 2025).

    The new AEWR methodology’s intention to address wage aggregation issues is noteworthy. However, analysts and industry experts acknowledge only a partial resolution of the old system’s imperfections. Issues remain with how the OEWS survey is conducted and with how job and commodity wage differentiation is addressed. The wage data used in the survey come from state unemployment insurance (UI) records. Because most farms are not included in those records, the survey mainly captures farm labor contractors and businesses that support agriculture rather than farms themselves. This can bias the results toward states with more labor contractors and may leave out states where farms are exempt from UI reporting.

    While the new two-tier wage system is based on skill level, it is still calculated by averaging wages across several different farm job types. This means wages for crop workers, livestock workers, and equipment operators are combined into one state average, even though those jobs typically pay different rates. In practice, crop and nursery workers, who make up most H-2A employees, are usually paid less than livestock workers and equipment operators. Because their wages are averaged together, the AEWR can end up higher than typical crop worker pay. Some argue that setting a separate AEWR specifically for crop and nursery workers would better reflect actual farm labor markets.
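    The averaging effect described above can be sketched numerically. The wages and employment shares below are hypothetical, chosen only to illustrate how folding higher-paid job types into one state average can pull an AEWR-style rate above typical crop-worker pay.

```python
# Hypothetical state-level mean wages ($/hr) and employment shares by job
# type -- illustrative numbers only, not actual survey data.
jobs = {
    "crop/nursery workers": (15.50, 0.60),
    "livestock workers":    (17.00, 0.25),
    "equipment operators":  (19.00, 0.15),
}

# One combined state average across all job types, as under the new method
combined_average = sum(wage * share for wage, share in jobs.values())
crop_wage = jobs["crop/nursery workers"][0]

print(f"Combined average (AEWR-style): ${combined_average:.2f}/hr")
print(f"Typical crop-worker wage:      ${crop_wage:.2f}/hr")
# The combined average exceeds the crop-worker wage, illustrating the
# concern that averaging across job types can overstate crop-worker pay.
```

    Under these illustrative shares the combined rate lands above what most crop and nursery workers earn, which is the core of the argument for a separate crop/nursery AEWR.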

    All told, the recent changes in the AEWR setting policy are crucial, but merely intermediate steps in aligning AEWRs to wage market equilibrium conditions.  Meanwhile, the DOL must continue to evaluate the need for further reforms that scrutinize and consider possible business and economic repercussions of unresolved “wage aggregation” areas in the AEWR determination scheme.

    Table 1.  Mean Wages for Selected Standard Occupation Classification (SOC) Codes and Adverse Effect Wage Rates (AEWRs), By Farm Labor Region, Fiscal Year 2024

    | Region | SOC 45-2091 Equipment Operators ($/hr) | SOC 45-2092 Crop, Nursery, Greenhouse ($/hr) | SOC 45-2093 Farm, Ranch, Aquaculture ($/hr) | AEWR ($/hr) |
    |---|---|---|---|---|
    | Appalachian I | 21.03 | 17.66 | 19.24 | 15.81 |
    | Appalachian II | 21.88 | 15.58 | 17.74 | 15.14 |
    | Cornbelt I | 24.35 | 19.24 | 17.50 | 18.18 |
    | Cornbelt II | 23.52 | 17.89 | 18.61 | 17.79 |
    | Delta | 16.08 | 15.97 | 18.02 | 14.53 |
    | Lake | 22.52 | 18.60 | 18.49 | 18.50 |
    | Mountain I | 23.51 | 19.15 | 19.05 | 16.54 |
    | Mountain II | 21.65 | 17.99 | 19.27 | 16.63 |
    | Mountain III | 17.52 | 16.73 | 18.35 | 16.32 |
    | Northeast I | 23.08 | 19.71 | 20.42 | 17.80 |
    | Northeast II | 22.46 | 18.27 | 18.65 | 17.20 |
    | Northern Plains | 22.90 | 18.04 | 19.27 | 18.32 |
    | Pacific | 20.65 | 18.39 | 19.97 | 19.25 |
    | Southeast | 18.03 | 16.60 | 16.03 | 14.68 |
    | Southern Plains | 17.71 | 15.90 | 16.64 | 15.55 |

    Source: U.S. Bureau of Labor Statistics.
  • Data Centers and Their Implications for Rural Communities 

    Authors: Kodjo Barnor, Doctoral Student, Texas A&M University & Chrystol Thomas, Assistant Professor & Extension Specialist, Texas A&M AgriLife Extension

    Data centers are moving into rural areas across the United States, competing with farmers and communities for land, water, and energy. Understanding what these facilities require and what they don’t offer in return is essential for rural communities. The key question is whether data centers complement or compete with agricultural operations. The answer depends largely on where they are located and how they use electricity, water, and land resources.

    These facilities underpin artificial intelligence, cloud computing, and online services. Development is expanding rapidly across the United States, with Virginia and Texas becoming the most attractive locations due to their competitive electricity markets, abundant land, and growing renewable energy capacity (Figure 1). Texas has roughly 4 gigawatts (GW) of capacity, with almost 8 GW under construction, and could surpass Virginia as the largest global data center market by 2030 [1].

    Figure 1. 2025 year-end leased and hyperscaler-owned data center capacity (GW).

    Note: From North America data center report year-end 2025, by A. Batson, 2026, Jones Lang LaSalle (JLL). https://www.jll.com/en-us/insights/market-dynamics/north-america-data-centers

    What Kind of Facilities Are We Talking About?

    Data centers vary widely in size. Small edge facilities serve local networks, while hyperscale facilities built by companies such as Amazon, Google, and Microsoft can span hundreds of acres and consume as much electricity as a small city (Table 1). These large facilities are increasingly locating in rural areas where land is available and energy infrastructure can be expanded.

    Table 1: Major Data Center Types and Typical Characteristics

    | Type | Typical Size | Power Demand | Land | Water Use |
    |---|---|---|---|---|
    | Hyperscale | 100,000+ sq ft | 100+ MW | 100–1,000+ acres | Up to several million gal/day |
    | Cloud | 50,000–200,000 sq ft | 10–99 MW | 10–40+ acres | Moderate to high |
    | Enterprise | 10,000–50,000 sq ft | 1–10 MW | 5–15 acres | Thousands of gal/day |
    | Edge/Micro | <10,000 sq ft | <1 MW | <1 acre | Minimal |

    Source: Northeast Regional Center for Rural Development (NERCRD), 2026. Definitions are approximate and may vary by project design, location, technology, and operational requirements [2].

    Energy Use

    Unlike agricultural electricity use, which is seasonal, data centers operate continuously. A 100 MW facility running around the clock consumes roughly 2,400 megawatt-hours of electricity per day, which is comparable to the daily electricity use of tens of thousands of households. In Texas, ERCOT forecasts that peak electricity demand could reach 218 gigawatts by 2031, compared with a record peak of 87 gigawatts in 2025 [3]. Data centers are expected to contribute significantly to this growth.
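    The 2,400 MWh/day figure follows directly from continuous operation, and the household comparison can be checked with a back-of-the-envelope calculation. The household usage figure below (about 30 kWh/day, roughly 10,800 kWh/year) is an assumed average; actual consumption varies widely by region and season.

```python
# Back-of-the-envelope check of the 2,400 MWh/day figure.
facility_mw = 100
hours_per_day = 24
daily_mwh = facility_mw * hours_per_day          # MW * hours = MWh per day

# Assumed average household use: ~30 kWh/day (illustrative, not a
# statistic from this article).
household_kwh_per_day = 30
daily_kwh = daily_mwh * 1_000                    # convert MWh to kWh
equivalent_households = daily_kwh / household_kwh_per_day

print(f"{daily_mwh:,} MWh/day, roughly {equivalent_households:,.0f} households")
```

    Under this assumption the facility's daily draw corresponds to on the order of 80,000 households, consistent with the "tens of thousands" comparison in the text.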

    For irrigators and other large electricity users, rising demand could mean higher electricity rates and greater grid stress during summer months when agricultural demand peaks. Large industrial loads may also require transmission upgrades, and rural electric cooperatives could face infrastructure costs that ultimately affect rates for agricultural customers.

    Water Use

    Many data centers rely on evaporative or hybrid cooling systems requiring significant water withdrawals. A facility using one million gallons per day consumes more than 1,100 acre-feet annually; a hyperscale campus can require several times that amount. In regions overlying the Ogallala Aquifer, the Trinity, the Carrizo-Wilcox, or other stressed groundwater systems, this additional demand is not trivial.
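    The gallons-to-acre-feet conversion behind that figure is straightforward to verify, using the standard definition of one acre-foot as 325,851 gallons.

```python
# Converting 1 million gallons/day of cooling-water use to acre-feet/year.
GALLONS_PER_ACRE_FOOT = 325_851   # standard conversion factor

daily_gallons = 1_000_000
annual_gallons = daily_gallons * 365
annual_acre_feet = annual_gallons / GALLONS_PER_ACRE_FOOT

print(f"{annual_acre_feet:,.0f} acre-feet per year")
```

    The result comes out just above 1,100 acre-feet per year, matching the article's figure; a hyperscale campus drawing several million gallons per day scales proportionally.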

    Texas groundwater law follows the rule of capture, meaning landowners generally have the right to pump groundwater beneath their property unless regulated by local groundwater conservation districts. A data center developer purchasing land in your area has the legal right to withdraw substantial groundwater without compensating neighboring agricultural operations, unless local limits are in place. Producers should engage their local groundwater conservation district early in any proposed development process to understand what protections exist.

    Land Use

    Data centers compete with agricultural land uses on a long-term basis. Once developed, these sites rarely return to agricultural production. This is a tradeoff that deserves careful evaluation in communities where farmland values underpin local economies. While projects can generate property tax revenue and construction employment, they typically employ relatively few workers once operational. A large facility may require only a few dozen full-time employees. Also, some developers are pairing data centers with nearby solar or wind generation to meet sustainability targets and stabilize energy costs. These projects may create opportunities for landowners.

    Conclusion

    Data centers are becoming an increasingly visible feature of rural America. While they can generate local revenue, they also place continuous demands on electricity and groundwater resources used by agriculture. Communities considering these developments should evaluate key issues in advance: which aquifer will supply water and at what volumes, whether transmission upgrades are required and who will bear the costs, the long-term employment footprint, and whether the overall economic benefits outweigh potential pressures on local infrastructure and resources. Careful evaluation of these factors will help communities better manage the expansion of data infrastructure across rural areas.

    References

    [1] Batson, A. 2026. North America data center report: Year-end 2025. Jones Lang LaSalle (JLL). https://www.jll.com/en-us/insights/market-dynamics/north-america-data-centers.

    [2] Northeast Regional Center for Rural Development (NERCRD). 2026. Data Centers: Assisting Communities in Understanding the Challenges and Opportunities. Webinar presentation slides, February 24. Pennsylvania State University.

    [3] ERCOT. 2025. Long-Term Load Forecast Report. Electric Reliability Council of Texas.

    [4] Houston Advanced Research Center (HARC). 2026. Thirsty Data: Water Use and the Projected Data Center Boom in Texas.


    Barnor, Kodjo, and Chrystol Thomas. “Data Centers and Their Implications for Rural Communities.” Southern Ag Today 6(14.5). April 3, 2026.

  • The Future of Food Aid

    Authors: Bart L. Fischer and Joe Outlaw

    The United States has a long and storied history of providing food aid to those who are in need around the world. One of the earliest examples dates to 1812, when the U.S. government sent $50,000 of wheat flour to Venezuela following a devastating earthquake. U.S. international food aid efforts were formalized with passage of the Food for Peace Act in 1954, which sought to alleviate global hunger while also disposing of domestic agricultural surpluses. The U.S. has consistently spent in excess of $4 billion per year on international food aid and is, by far, the world’s largest contributor.[1] That funding results in more than 1 million metric tons of U.S.-grown agricultural commodities—including corn, sorghum, rice, and wheat—being shipped to recipient countries each year, serving as a consistent source of demand and helping to stabilize domestic prices.

    Over the last 200 years, policymakers have regularly debated the appropriate role of the U.S. government in providing international food aid.  For example, during Congressional debates in 1847 about whether to send aid in response to the Irish Potato Famine, President Polk threatened a veto, arguing that it was his “solemn conviction…that Congress possesses no power to use public money for any such purpose.”  For decades, policymakers have also debated the appropriate balance between food aid and development assistance—the proverbial “give a man a fish and you feed him for a day, teach a man to fish and you feed him for a lifetime.”  Both featured prominently in the development and passage of the Food for Peace Act over 70 years ago, and both remain a prominent feature of American global food security efforts today. 

    More recently, the debates over international food aid have been more technical in nature, largely focused on the efficiency of food aid (or the perceived lack thereof) and the impact of that aid on the recipient country. On the latter, an amendment to the Food for Peace Act from Senator Henry Bellmon (R-OK) in the International Development and Food Assistance Act of 1977 required the U.S. government to assess whether the recipient country has adequate storage facilities and whether assistance would interfere with the recipient country’s agricultural economy before shipping aid.

    Despite these efforts, criticism of in-kind aid (i.e., U.S.-grown commodities) has resulted in a push for procurement of food aid in local/regional markets and for cash transfers for people to use in local shops. In 2010, the Obama Administration introduced a cash-based assistance program known as the Emergency Food Security Program (EFSP) that was designed to complement U.S.-grown commodities by allowing local/regional purchases along with cash transfers. Since then, the use of cash-based assistance has rapidly expanded, now accounting for more than half of international food aid provided by the United States. This has led to growing concerns among policymakers about a lack of accountability with cash-based assistance and the ease with which it can be misappropriated in recipient countries.

    This brings us to Spring 2025. As part of their efforts to improve government efficiency and ensure U.S. international efforts are yielding promised results, the Trump Administration eliminated the U.S. Agency for International Development (USAID), which oversaw Food for Peace, absorbing those efforts into the broader State Department. In December 2025, USDA entered into an interagency agreement to take over the administration of Food for Peace. In March 2026, the Committee on Agriculture in the U.S. House of Representatives marked up its farm bill proposal—the Farm, Food, and National Security Act of 2026—which would permanently transfer the authorities of the Food for Peace Act from USAID to USDA while reserving 50% of Food for Peace resources for U.S. grown commodities, returning the program “to its original intent of addressing the global hunger crisis through the purchase of U.S. grown commodities.”[2]

    For many in the agricultural community, the shift to local/regional procurement and cash-based food assistance—however well intentioned—was undermining the historic mission of Food for Peace. Moving Food for Peace to USDA presents an opportunity to refocus the program, helping ensure that it can survive another 70 years in fulfilling its dual mission of using domestic agricultural surplus to help feed those in need around the world.


    [1] https://www.gao.gov/international-food-assistance

    [2] https://agriculture.house.gov/uploadedfiles/final_2026_ffp_onepager.pdf


    Fischer, Bart L., and Joe Outlaw. “The Future of Food Aid.” Southern Ag Today 6(14.4). April 2, 2026.