Blog

  • Data Centers and Their Implications for Rural Communities 


Authors: Kodjo Barnor, Doctoral Student, Texas A&M University, and Chrystol Thomas, Assistant Professor & Extension Specialist, Texas A&M AgriLife Extension

Data centers are moving into rural areas across the United States, competing with farmers and communities for land, water, and energy. Understanding what these facilities require and what they don’t offer in return is essential for rural communities. The key question is whether data centers complement or compete with agricultural operations. The answer depends largely on where they are located and how they use electricity, water, and land resources.

    These facilities underpin artificial intelligence, cloud computing, and online services. Development is expanding rapidly across the United States, with Virginia and Texas becoming the most attractive locations due to their competitive electricity markets, abundant land, and growing renewable energy capacity (Figure 1). Texas has roughly 4 gigawatts (GW) of capacity, with almost 8 GW under construction, and could surpass Virginia as the largest global data center market by 2030 [1].

Figure 1. 2025 year-end leased and hyperscaler-owned data center capacity (GW).

    Note: From North America data center report year-end 2025, by A. Batson, 2026, Jones Lang LaSalle (JLL). https://www.jll.com/en-us/insights/market-dynamics/north-america-data-centers

    What Kind of Facilities Are We Talking About?

Data centers vary widely in size. Small edge facilities serve local networks, while hyperscale facilities built by companies such as Amazon, Google, and Microsoft can span hundreds of acres and consume as much electricity as a small city (Table 1). These large facilities are increasingly locating in rural areas where land is available and energy infrastructure can be expanded.

    Table 1: Major Data Center Types and Typical Characteristics

Type | Typical Size | Power Demand | Land | Water Use
Hyperscale | 100,000+ sq ft | 100+ MW | 100–1,000+ acres | Up to several million gal/day
Cloud | 50,000–200,000 sq ft | 10–99 MW | 10–40+ acres | Moderate to high
Enterprise | 10,000–50,000 sq ft | 1–10 MW | 5–15 acres | Thousands of gal/day
Edge/Micro | <10,000 sq ft | <1 MW | <1 acre | Minimal
Source: Northeast Regional Center for Rural Development (NERCRD), 2026. Definitions are approximate and may vary by project design, location, technology, and operational requirements [2].

    Energy Use

Unlike agricultural electricity use, which is seasonal, data centers operate continuously. A 100 MW facility running around the clock consumes roughly 2,400 megawatt-hours (MWh) of electricity per day, which is comparable to the daily electricity use of tens of thousands of households. In Texas, ERCOT forecasts that peak electricity demand could reach 218 gigawatts by 2031, compared with a record peak of 87 gigawatts in 2025 [3]. Data centers are expected to contribute significantly to this growth.
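The 2,400 MWh figure follows directly from continuous operation, and a quick back-of-the-envelope check shows where "tens of thousands of households" comes from. The household figure of roughly 30 kWh per day is an assumption for illustration, not a number from the article:

```python
# Sanity check of the article's 100 MW example.
# ASSUMPTION (not from the article): an average U.S. household
# uses roughly 30 kWh of electricity per day.
FACILITY_MW = 100
HOURS_PER_DAY = 24
HOUSEHOLD_KWH_PER_DAY = 30  # assumed average

daily_mwh = FACILITY_MW * HOURS_PER_DAY                 # continuous load
households = daily_mwh * 1_000 / HOUSEHOLD_KWH_PER_DAY  # MWh -> kWh, then divide

print(f"{daily_mwh:,} MWh/day ≈ {households:,.0f} households")
# prints: 2,400 MWh/day ≈ 80,000 households
```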

    For irrigators and other large electricity users, rising demand could mean higher electricity rates and greater grid stress during summer months when agricultural demand peaks. Large industrial loads may also require transmission upgrades, and rural electric cooperatives could face infrastructure costs that ultimately affect rates for agricultural customers.

    Water Use

    Many data centers rely on evaporative or hybrid cooling systems requiring significant water withdrawals. A facility using one million gallons per day consumes more than 1,100 acre-feet annually; a hyperscale campus can require several times that amount. In regions overlying the Ogallala Aquifer, the Trinity, the Carrizo-Wilcox, or other stressed groundwater systems, this additional demand is not trivial.
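The acre-feet conversion above can be verified with the standard U.S. factor of 325,851 gallons per acre-foot; a minimal sketch:

```python
# Convert the article's one-million-gallon-per-day example to acre-feet per year.
GALLONS_PER_ACRE_FOOT = 325_851  # standard U.S. conversion factor

daily_gallons = 1_000_000
annual_acre_feet = daily_gallons * 365 / GALLONS_PER_ACRE_FOOT

print(f"{annual_acre_feet:,.0f} acre-feet per year")
# prints: 1,120 acre-feet per year — consistent with "more than 1,100" above
```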

    Texas groundwater law follows the rule of capture, meaning landowners generally have the right to pump groundwater beneath their property unless regulated by local groundwater conservation districts. A data center developer purchasing land in your area has the legal right to withdraw substantial groundwater without compensating neighboring agricultural operations, unless local limits are in place. Producers should engage their local groundwater conservation district early in any proposed development process to understand what protections exist.

    Land Use

Data centers compete with agricultural land uses on a long-term basis: once developed, these sites rarely return to agricultural production. That tradeoff deserves careful evaluation in communities where farmland values underpin local economies. While projects can generate property tax revenue and construction employment, they typically employ relatively few workers once operational; a large facility may require only a few dozen full-time employees. At the same time, some developers are pairing data centers with nearby solar or wind generation to meet sustainability targets and stabilize energy costs, and these projects may create opportunities for landowners.

    Conclusion

    Data centers are becoming an increasingly visible feature of rural America. While they can generate local revenue, they also place continuous demands on electricity and groundwater resources used by agriculture. Communities considering these developments should evaluate key issues in advance: which aquifer will supply water and at what volumes, whether transmission upgrades are required and who will bear the costs, the long-term employment footprint, and whether the overall economic benefits outweigh potential pressures on local infrastructure and resources. Careful evaluation of these factors will help communities better manage the expansion of data infrastructure across rural areas.

    References

    [1] Batson, A. 2026. North America data center report: Year-end 2025. Jones Lang LaSalle (JLL). https://www.jll.com/en-us/insights/market-dynamics/north-america-data-centers.

    [2] Northeast Regional Center for Rural Development (NERCRD). 2026. Data Centers: Assisting Communities in Understanding the Challenges and Opportunities. Webinar presentation slides, February 24. Pennsylvania State University.

    [3] ERCOT. 2025. Long-Term Load Forecast Report. Electric Reliability Council of Texas.

    [4] Houston Advanced Research Center (HARC). 2026. Thirsty Data: Water Use and the Projected Data Center Boom in Texas.


Barnor, Kodjo, and Chrystol Thomas. “Data Centers and Their Implications for Rural Communities.” Southern Ag Today 6(14.5). April 3, 2026.

  • The Future of Food Aid


    Bart L. Fischer and Joe Outlaw

The United States has a long and storied history of providing food aid to those who are in need around the world. One of the earliest examples dates to 1812, when the U.S. government sent $50,000 of wheat flour to Venezuela following a devastating earthquake. U.S. international food aid efforts were formalized with passage of the Food for Peace Act in 1954, which sought to alleviate global hunger while also disposing of domestic agricultural surpluses. In total, the U.S. has consistently spent in excess of $4 billion per year on international food aid and is, by far, the world’s largest contributor.[1] That funding results in more than 1 million metric tons of U.S.-grown agricultural commodities—including corn, sorghum, rice, and wheat—being shipped to recipient countries each year, serving as a consistent source of demand and helping to stabilize domestic prices.

    Over the last 200 years, policymakers have regularly debated the appropriate role of the U.S. government in providing international food aid.  For example, during Congressional debates in 1847 about whether to send aid in response to the Irish Potato Famine, President Polk threatened a veto, arguing that it was his “solemn conviction…that Congress possesses no power to use public money for any such purpose.”  For decades, policymakers have also debated the appropriate balance between food aid and development assistance—the proverbial “give a man a fish and you feed him for a day, teach a man to fish and you feed him for a lifetime.”  Both featured prominently in the development and passage of the Food for Peace Act over 70 years ago, and both remain a prominent feature of American global food security efforts today. 

More recently, the debates over international food aid have been more technical in nature, largely focused on the efficiency of food aid (or the perceived lack thereof) and the impact of that aid on the recipient country. On the latter, an amendment to the Food for Peace Act from Senator Henry Bellmon (R-OK) in the International Development and Food Assistance Act of 1977 required the U.S. government to assess whether the recipient country has adequate storage facilities and whether assistance would interfere with the recipient country’s agricultural economy before shipping aid.

    Despite these efforts, criticism of in-kind aid (i.e., U.S.-grown commodities) has resulted in a push for procurement of food aid in local/regional markets and for cash transfers for people to use in local shops. In 2010, the Obama Administration introduced a cash-based assistance program known as the Emergency Food Security Program (EFSP) that was designed to complement U.S.-grown commodities by allowing local/regional purchases along with cash transfers. Since then, the use of cash-based assistance has rapidly expanded, now accounting for more than half of international food aid provided by the United States. This has led to growing concerns among policymakers about a lack of accountability with cash-based assistance and the ease with which it can be misappropriated in recipient countries.

This brings us to Spring 2025. As part of their efforts to improve government efficiency and ensure U.S. international efforts are yielding promised results, the Trump Administration eliminated the U.S. Agency for International Development (USAID), which oversaw Food for Peace, absorbing those efforts into the broader State Department. In December 2025, USDA entered into an interagency agreement to take over the administration of Food for Peace. In March 2026, the Committee on Agriculture in the U.S. House of Representatives marked up its farm bill proposal—the Farm, Food, and National Security Act of 2026—which would permanently transfer the authorities of the Food for Peace Act from USAID to USDA while reserving 50% of Food for Peace resources for U.S.-grown commodities, returning the program “to its original intent of addressing the global hunger crisis through the purchase of U.S. grown commodities.”[2]

    For many in the agricultural community, the shift to local/regional procurement and cash-based food assistance—however well intentioned—was undermining the historic mission of Food for Peace. Moving Food for Peace to USDA presents an opportunity to refocus the program, helping ensure that it can survive another 70 years in fulfilling its dual mission of using domestic agricultural surplus to help feed those in need around the world.


    [1] https://www.gao.gov/international-food-assistance

    [2] https://agriculture.house.gov/uploadedfiles/final_2026_ffp_onepager.pdf


Fischer, Bart L., and Joe Outlaw. “The Future of Food Aid.” Southern Ag Today 6(14.4). April 2, 2026.

  • Southern Acreage Shifts Toward Soybeans and Cotton in 2026 Planting Intentions


    The USDA released the 2026 Prospective Plantings Report on March 31. Nationally, producers intend to plant fewer corn acres and more soybeans and cotton. Corn acreage is projected at 95.3 million acres, down 3.5 percent from the 2025 actual planted acreage, while soybean acreage is expected to increase 4.3 percent to 84.7 million acres. Cotton acreage is also forecast higher, up 3.8 percent to 9.64 million acres. These shifts reflect relative price signals and input cost considerations that have increasingly favored soybeans and cotton over corn in many areas.

In the Southern region, prospective plantings generally follow national trends, though the magnitude varies across states. Texas, the largest corn-producing state in the South, is projected to increase acreage by 4 percent to 2.6 million acres. Kentucky, the second-largest corn state in the region, is projected to reduce acreage by 4.6 percent, to 1.45 million acres. Meanwhile, the third-largest corn state in the region, Tennessee, is projected to increase acreage by 7.5 percent, to 1.0 million acres. Mississippi is projected to see the largest decline, down 31 percent to 630 thousand acres, followed by Arkansas, down 27 percent to 590 thousand acres.

    For soybeans, the Southern region is projected to increase planted acreage by 10 percent, led by Arkansas and Mississippi. Arkansas is projected to plant 3.1 million acres, up 19.7 percent from last year, while Mississippi is expected at 2.3 million acres, up 27.1 percent. Kentucky, North Carolina, and Tennessee round out the top five soybean states in the region, with acreage changes of 2.8 percent, 4.3 percent, and no change, respectively.

For cotton, the Southern region is projected to increase planted acreage by 4 percent to 8.94 million acres. Texas continues to dominate regional production, with acreage rising 3.7 percent to 5.52 million acres. Several states are projected to see notable gains, including Louisiana, up 22.2 percent; Tennessee, up 22 percent; North Carolina, up 19.3 percent; Oklahoma, up 15.4 percent; and Georgia, up 7.8 percent. In contrast, acreage declines are expected in Arkansas, Mississippi, and Virginia, down 9.6 percent, 9.1 percent, and 4.1 percent, respectively. Meanwhile, Alabama and South Carolina are projected to see no change.

    For rice, the Southern region is projected to see a sharp reduction in planted acreage, declining 21 percent to 1.64 million acres. Arkansas remains the largest rice producing state, though acreage is projected to fall 22 percent to 1.0 million acres. Mississippi shows the largest percentage decline, down 51 percent to 80 thousand acres. Louisiana and Texas are also projected to reduce acreage by 10.8 percent and 13.8 percent, respectively. Overall, the decline reflects weaker market signals and continued input cost pressure across the region.

A large reduction is also expected in peanut acres, with the Southern region projected to reduce planted acreage by 14.4 percent to 1.65 million acres. Georgia remains the dominant producer, though acreage is projected to decline 15.2 percent to 780 thousand acres. All states but Alabama are expected to see reductions, with Arkansas, South Carolina, and Texas all down more than 20 percent and Mississippi down 43 percent. Smaller declines are expected in Florida, North Carolina, Oklahoma, and Virginia. Alabama peanut acres are expected to increase 2.6 percent.

    It is important to remember that this report reflects prospective plantings rather than final acreage. These estimates are based on producer surveys conducted in early March and are subject to change. A number of factors could influence actual planted acreage moving forward. In particular, these estimates likely do not fully capture the potential impact of the Iran conflict on fertilizer prices and planting decisions. Weather also plays a critical role in determining how much acreage gets planted. For producers, these figures serve as an early benchmark of current expectations that can be used when evaluating how evolving market conditions and weather may shift final planted acreage.

    Table 1. 2026 Prospective Planting Acreage in the Southern Region (thousand acres)

    Note: Acres are reported in thousand acres. Percent change represents 2026 intended plantings relative to 2025 actual planted acres. Dashes indicate negligible or unreported acreage.
    Source: USDA Prospective Planting Report, March 31, 2026.

Maples, William E. “Southern Acreage Shifts Toward Soybeans and Cotton in 2026 Planting Intentions.” Southern Ag Today 6(14.3). April 1, 2026.

  • Price Relationships of Beef X Dairy Calves and Dairy Calves


    Authors: Charley Martinez, Parker Wyatt, and Eli Mundy

Over the last few years, the Beef X Dairy (BxD) markets have gained attention due to the rise in BxD prices. Day-old calves (80–90 pounds) have gone from $50 per head a few years back to recent Pennsylvania auction data showing 80–89 pound BxD calves averaged $1,706.21 per head for the week ending March 21st. A question that has been asked recently centers on the value of purebred dairy calves compared to BxD calves. For the same week, 80–89 pound purebred dairy calves averaged $1,329.60 per head, a difference of $376.61. This article examines the premium over time for BxD calves over purebred dairy calves.
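The per-head premium quoted above is simply the difference between the two weekly average prices; a minimal sketch using the article's March 21st figures:

```python
# Week ending March 21st, New Holland, PA (figures from the article)
bxd_price = 1706.21    # 80–89 lb BxD calves, $/head
dairy_price = 1329.60  # 80–89 lb purebred dairy calves, $/head

premium = bxd_price - dairy_price
print(f"BxD premium: ${premium:.2f} per head")
# prints: BxD premium: $376.61 per head
```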

    The United States Department of Agriculture’s (USDA) weekly New Holland Pennsylvania market report currently provides the most extensive and consistent information regarding the price of BxD and purebred dairy calves. Very few markets report BxD calves separately, and there may be variation in local prices influenced by regional differences in demand, transportation cost, and buyer composition. Figure 1 shows the price premium received for BxD calves over purebred dairy calves at the Pennsylvania market for the previous 5-year average (2020-2024), 2025, and 2026.  

    Figure 1. Monthly Premium for BxD Calves Over Purebred Dairy Calves ($/head), New Holland, PA.

    Data from USDA-AMS

    The chart shows a consistent premium per head for BxD calves relative to purebred dairy calves with increased premiums during August through October. During 2020–2024 (thick red line), the average premium ranged from about $130 (first quarter) to $200 per head (August-October). In 2025 (dotted line), the premium was notably higher than the previous 5-year average, beginning the year at approximately $255 per head in January, climbing through the spring, and reaching its highest levels of about $450–$470 per head in early fall, before declining toward year‑end. In 2026 (thin solid blue line), there has been an even larger premium than last year, approximately $340 in January and over $420 per head by February. This suggests an exceptionally strong relative demand for BxD calves compared to dairy calves during that period. 

    With no clear signs of increases in the beef calf crop, the premiums for BxD calves can be expected to remain strong or continue to increase. These premiums reflect tight supplies of feeder cattle and a willingness from feedlot operators to pay for calves that offer improved feed efficiency, growth performance, and carcass characteristics relative to purebred dairy calves. This raises a few important questions: how high can the premium go, and what happens when traditional beef cattle numbers begin to rise? If the beef herd expansion is relatively gradual, BxD calves may retain a meaningful premium and a strong role in the supply chain. Conversely, rapid rebuilding of the beef herd could result in a faster narrowing of the price differential, particularly if feedlot operators shift back quickly to traditional beef calves. Ultimately, the long-run trajectory of BxD premiums will hinge on the rate of supply growth, their ability to compete with traditional beef calves on feedlot efficiency, and downstream demand for beef. 


Martinez, Charley, Parker Wyatt, and Eli Mundy. “Price Relationships of Beef X Dairy Calves and Dairy Calves.” Southern Ag Today 6(14.2). March 31, 2026.

  • What Lower Interest Rates Mean for 2026 Budgets


Following the benchmark rate reductions in 2025, the Federal Open Market Committee (FOMC) left the federal funds target rate unchanged at 3.50–3.75% at its March 2026 meeting (Federal Reserve, 2026). While benchmark rates have come down from post-pandemic highs, borrowing costs remain elevated compared to the low-interest-rate environment before 2022. For farmers across the country, this poses a significant challenge, as interest expense remains a notable portion of crop budgets while producers try to balance drastic increases in operating expenses. The expectation of near-zero rates may be unrealistic, but it is important to highlight that even with lower rates, interest expense continues to contribute to the on-farm price-cost squeeze.

    Table 1 is derived from a previous article (see Loy, 2023) and updated to reflect an average budget for a Midsouth corn, cotton, rice, or soybean farmer in 2026. Interest expenses are based on the average fixed operating loan rates from the Federal Reserve Bank of Kansas City Agricultural Credit Survey. Operating loan terms are assumed to have a 9-month payback period and include select 2026 pre-harvest production expenses.

    Table 1. Southern Region, Select 2026 Pre-Harvest Production Expenses ($/acre)

Expense | Corn | Cotton | Rice | Soybeans
Seed | $125.00 | $113.00 | $130.00 | $91.00
Fertilizer | $360.00 | $290.00 | $258.00 | $117.00
Pesticides | $54.00 | $205.00 | $118.00 | $93.00
Fuel | $29.00 | $52.00 | $104.00 | $64.00
Operating interest expense at varying rates:
Q1 2026 (7.20%) | $29.93 | $34.78 | $32.15 | $19.24
Q1 2025 (7.50%) | $31.98 | $37.16 | $34.34 | $20.55
Q1 2024 (8.20%) | $34.19 | $39.73 | $36.72 | $21.97
Q1 2023 (7.43%) | $31.64 | $36.76 | $33.98 | $20.33

    Note: Operating Interest Expense assumes a 9-month term (e.g., 7.20% * (9/12) * principal borrowed)
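The note's formula can be sketched in a few lines of Python. Treating the principal as the sum of the listed pre-harvest expenses is an assumption for illustration; Table 1's figures may rest on slightly different principals or rounding, so the result below is close to but not identical to the table's Q1 2026 corn value:

```python
# Operating interest per the table note: annual rate * (term/12) * principal.
def operating_interest(annual_rate: float, principal: float, months: int = 9) -> float:
    """Interest on an operating loan carried for `months` of the year."""
    return annual_rate * (months / 12) * principal

# ASSUMED principal: sum of the listed corn pre-harvest expenses ($/acre)
corn_expenses = 125.00 + 360.00 + 54.00 + 29.00  # seed + fertilizer + pesticides + fuel

print(f"${operating_interest(0.0720, corn_expenses):.2f} per acre at Q1 2026 rates")
```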

Table 1 illustrates that the benchmark rate reductions, when applied to this year’s production expenses, have provided little relief. Estimated interest expenses for 2026 are down only marginally from the 2024 peak, when the same operating costs would have generated about $4.26, $4.95, $4.58, and $2.74 more in interest expense per acre for corn, cotton, rice, and soybeans, respectively. The reductions are even more modest compared with interest costs in 2023 and 2025.

Overall, while interest expenses have eased, they remain a meaningful part of pre-harvest production planning. Recent benchmark rate reductions have provided some relief, but razor-thin on-farm margins persist. At the same time, higher input costs to grow the same crop have increased the amount that must be financed, potentially offsetting some of the benefit of a lower interest rate environment.

    References

Board of Governors of the Federal Reserve System, Federal Open Market Committee. 2026. Federal Reserve Press Release, January 28, 2026. Retrieved from https://www.federalreserve.gov/monetarypolicy/files/monetary20260128a1.pdf

Federal Reserve Bank of Kansas City. 2026. Federal Reserve Ag Credit Survey. Retrieved from https://www.kansascityfed.org/center-for-agriculture-and-the-economy/agricultural-data-and-indicators/

Loy, R. 2023. The Federal Funds Rate Impact on Agricultural Lending. Southern Ag Today 3(34.3). Retrieved from https://southernagtoday.org/2023/08/23/the-federal-funds-rate-impact-on-agricultural-lending/


Loy, Ryan. “What Lower Interest Rates Mean for 2026 Budgets.” Southern Ag Today 6(14.1). March 30, 2026.