Written by Matt DiLeo
The “perennial grain” story seems to pop up every few months. The basic idea is that perennial crops would have higher yields and lower environmental impacts than their annual kin.
The picture on the left explains pretty clearly why – large permanent root systems secure the topsoil, exhaustively scavenge water and nutrients and support more vigorous shoot growth over a longer season.
This week, it’s perennial maize.
One thing that I think is funny about these stories is that they inevitably herald the accompanying freedom from multinational seed companies. Aside from the fact that no farmer’s forced to buy seed, I don’t see any reason why companies wouldn’t jump on a perennial grain bandwagon.* Companies like Monsanto are already selling/developing advanced varieties (e.g. Bt/Roundup) of perennial crops like alfalfa and sugarcane.**
Companies don’t have to sell seed every year to make money. In this case, I’ve heard they’ll be offering an annual license agreement (i.e. you buy the seed the first year and pay a license fee for each following year that you continue to cultivate the crop). I think it shows a lack of creativity that people always pin the blame for what they don’t like about the state of agriculture on the “need” for companies to sell seed every spring. There are lots of ways to do business.
Which isn’t to say I’d expect companies to make the initial investment – jumpstarting speculative new technologies and industries is the role of governments and non-profits. According to Ed Buckler (in the article), an easy $15-30 million should do this. This is an inconsequential speck in the U.S. budget and we should probably just get it done.
h/t: Agricultural Biodiversity Weblog
* I heard that a coalition of wheat farmers actually petitioned companies like Monsanto to begin reinvesting in transgenic wheat varieties since wheat yields have fallen so far behind other crops like maize.
** Pest and herbicide resistance is particularly valuable in perennial systems as bugs and weeds tend to build up over years without tilling or rotation.
*** It might also be a concern how you’d maintain high genetic gains in yield year by year and decade by decade when you switch an annual to a perennial, but I imagine the gains inherent to perennialism, paired with our incredible current breeding technology, should make this well worth it.
Written by Guest Expert
Matt DiLeo has a PhD in Plant Pathology from UC, Davis. During his postdoctoral research at Boyce Thompson Institute, he researched unintentional effects of genetic engineering. Matt builds R&D teams and biotech platforms: genome editing, gene discovery, microbials, and controlled environment agriculture.
An easy $15-$30M?
Doubtful – even if you could perennialize maize you’d probably then also have to completely re-breed the plant to perform well, and at the end of the day have something that doesn’t do as well as annual maize. Annual maize puts all its resources into producing as much grain as it possibly can – there is no reserve of energy to survive the harsh winter and restart next season; corn plants essentially suck themselves dry come the end of the season (roots and all) to allocate all resources to the seed. So you’d have to assume that in establishing a deep root system (deeper and more extensive than it already has – corn is no slouch in this department) you’d take a big hit in first-year yield, particularly as all that dry matter accumulation has to stay in situ rather than remobilizing to the grain, and it isn’t obvious that you wouldn’t take big hits in subsequent years anyway, simply in terms of maintaining the root system for 5–6 months of the year. (Also, how easy is it to have a plant still die back the way corn needs to while maintaining perenniality? I’ve touched vaguely on some of the genetics of senescence, and to state that it’s a pretty easy few-gene affair is kinda like suggesting that getting to the moon is just a simple case of understanding gravity.)
On the business aspect – I agree with your assertions. If perennial row crops were developed and were successful then I see no reason that big agribusiness wouldn’t find a way to make a buck off them – assuming you still need rootworm, earworm and weed protections you have an instant in for transgenic licensing – and frankly, seed you sell once every 5 or so years is better for a company than seed you never sell, so even just general seed would still be pursued. Farmers are still going to want to purchase the perennial which gives them the best yield and will probably be willing to pay a tad bit extra if it’s going to give them 5+ years, so the lost profit could probably be made up that way also (maybe not – my obvious lack of business acumen is probably clear here to anyone who actually has any…).
I agree that when one studies life history theory and the trade-offs inherent in plant allocation that seed yield could take a hit.
However, what we should be concerned about is net energy and soil building. Our boundary of analysis needs to be much broader than “how many bushels per acre did you get this year?” Also, looking at life history theory, we’d expect perennials to do much better over the long run.
Another advantage to perennials is that if breeding is done not just for seeds but also for forage quality you can stack functions. Graze during the winter and early spring, then pull livestock off just ahead of the bolting phase. This is already done on grass seed fields in Oregon and has kept the sheep industry alive even while New Zealand and Australian imports were historically cheap.
Grass fed meat and perennial grains look like a killer combination for anybody smart enough to invest in them.
Just a bit more background. I have spoken directly to some of the folks involved in this research and apparently there are two general tracks for development.
The first is to take existing cultivars and cross them with perennial relatives and then do selection.
The other is to domesticate the perennial relatives.
Marker assisted breeding is used and its expansion would really help speed this up. As would more plots and more researchers to do more trials, etc.
For most farmers, however, looking at net energy and soil building will be of no interest if yearly income takes a hit – hence the focus on yield. If energy input prices increase to the point where the difference is neutral to net positive for farmers, then perhaps we’re talking (likewise with soil building – with prices of N set to increase, this could well be a selling point).
I wonder how the forage quality of corn is likely to differ. Sans dry down, you not only run into issues of reduced yields but also of increased costs getting your grain to that 15% moisture (if I remember the number right) – higher than this and you incur end-of-season costs running ovens to dry down the harvest (which is why it isn’t uncommon in wet years to see fields of corn left unharvested through the winter). With dry down, you’ve got honking great unpalatable stalk material all over the darn place.
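Just to put a rough number on that drying-cost point – a back-of-envelope sketch, not field data; the 25% harvest moisture, latent heat and dryer efficiency are assumed illustrative values, and only the ~15% storage target comes from the paragraph above:

```python
# Sketch: water removed and drying energy per tonne of wet grain.
# Assumed illustrative values: 25% wet-basis moisture at harvest,
# 15% storage target, ~2.26 MJ/kg latent heat of vaporization,
# ~50% overall dryer efficiency.
wet_mass_kg = 1000.0
moisture_in = 0.25           # assumed harvest moisture (wet basis)
moisture_out = 0.15          # the ~15% storage target mentioned above

dry_matter_kg = wet_mass_kg * (1 - moisture_in)
final_mass_kg = dry_matter_kg / (1 - moisture_out)
water_removed_kg = wet_mass_kg - final_mass_kg

latent_heat_mj_per_kg = 2.26
dryer_efficiency = 0.5
energy_mj = water_removed_kg * latent_heat_mj_per_kg / dryer_efficiency

print(f"water to remove: {water_removed_kg:.0f} kg per tonne")  # ~118 kg
print(f"drying energy:   {energy_mj:.0f} MJ per tonne")         # ~530 MJ
```

On those assumptions you are evaporating on the order of a hundred kilograms of water per tonne of grain, which is why a crop that can’t dry down in the field gets expensive fast.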
Also, corn ain’t too great at growing under cold conditions – I don’t foresee any capacity to grow much in the way of forage between traditional harvest and planting times. The changes required to the biology of the plant are probably significant enough that the smart money may simply opt for something like wheat, which already has the genetics in place for winter and summer growth (albeit in different varieties), rather than starting from scratch with corn, which has, over the years, been heavily perfected for the annual lifestyle. Corn simply doesn’t appear to be a great candidate – particularly with other promising grains which could fulfil the task.
I think increased fuel and fertilizer costs will force more long-term thinking. Also depends on how much of the operation is debt financed. Less debt = greater long-term perspective, in general.
And I agree that the small grains are better candidates for this combination forage and seed harvest idea. I know that annual wheat is sometimes grazed in early winter if it comes up very quickly. Grazing often delays maturation, which can be a good thing if you are trying to harvest during the warmest month of the year.
Ewan touched on it, but I’d like to elaborate: Maize is tropical; even a perennial variety would have to be treated as an annual in Saskatchewan.
Tomatoes, peppers, potatoes and several other crops (even non-solanaceous ones!) are treated similarly.
Hardiness is a different concept for perennials and winter annuals vs. summer annuals. Breeding for a perennial crop from an annual one should start from something that survives the winter as a plant.
If anyone can do it with corn, it’s Ed Buckler, but I third the thought that small grains would be better. Why aren’t we doing more with sorghum and amaranth?
Does anyone know what approaches researchers are actually taking to do this? I could imagine a potentially useful near-term goal being (perennial) small grains that resprout after harvest (which is often early in the summer anyway) and use their late summer/fall growth to build root biomass or feed animals.
You wouldn’t need to re-invent hardiness or vernalization control if you used something like wheat, but I have no clue what the genetic architecture of a persistent crown/root system would be. Possibly just a simple meristem switch? I know there’s been a lot of talk about perennial wheat grass. Anyone have a clue what’s known about what makes grasses perennial or annual?
I’m gonna have to look into this myself if no one knows…
This is one of those instances where I would dearly love to be utterly, utterly wrong. If only because knowledge of how to get there would be awesomeness of the first degree.
Maybe it would be easy?
A quick search found this. I really need to resist reading it now but, I will later…
http://jxb.oxfordjournals.org/content/55/403/1715.short
“A single chromosome addition from Thinopyrum elongatum confers a polycarpic, perennial habit to annual wheat”
Matt,
I think a lot of “perenniality” has to do with resource allocation: there are many examples of plants with relatives that are the other way (Anethum and Foeniculum will cross, but one is annual, the other not); the difference seems to be whether the plant irrevocably switches to “all resources to the seeds” mode. There are several species with both perennial and annual races (e.g. Zea m.), and some can be manipulated to “perennialize” if seed formation is prevented the first season (Alcea, Meconopsis). Consider the larger question of being monocarpic, like some bamboos. (Of course, there ARE some species that are also “obligate annuals,” no matter how you grow them.)
I guess that narrows things down somewhat – but it doesn’t get past the major stumbling blocks –
What does it do to plant morphology (Wheat x Oats discussion recently highlighted that swapping whole chromosomes, while cool and all, gives some pretty poorly plants)?
What does the alteration do to end-of-season yield (on top of other characteristics like grain composition – which for wheat particularly is very important and not to be messed with)?
’twould be interesting to see what radiation-induced chromosome trickery (as per the wheat x oat) told us in terms of which bits of the chromosome induce the switch – you’d assume that with less of the chromosome you’d have fewer deleterious effects to breed out/around – plus, armed with knowledge of the smaller number of genes involved, you could attempt to transgenically manipulate other crops to do the same thing.
I don’t think that it’s necessarily a foregone conclusion that annuals will always outperform perennials. While an annual will put all of its energy into the grain, the perennial will have more time to collect energy. The energy it holds back from the grain in fall isn’t massive, but it keeps the rest of the plant alive and collecting sunlight, and that collected sunlight is stored up and makes a showing in the spring when the plant comes back much faster and has much more surface area to collect energy to go into the seeds.
Interesting… I’m getting the impression that genetic control of perenniality itself isn’t necessarily complex. This sounds like just the type of problem that cheap sequencing of an appropriate population could knock out…
Though according to this paper from the Land Institute, there’s still a lot of mystery left in it:
“However, regrowth in wheat, like rhizome development in rice or sorghum, indicates the capacity to remain alive after maturity and harvest but does not in itself guarantee perenniality. For example, the tissues of a perennial wheat plant that survives after summer harvest must also be able to remain in or return to a vegetative state (to avoid flowering out of season, which could be fatal), maintain a robust root system, stay alive through hot or dry conditions during late summer, survive freezing temperatures (if in the temperate zone), and then initiate reproductive growth at the appropriate time the following spring (Lammer et al. 2004). These complex environmental responses are affected by many genes.”
Cox, TS, JD Glover, DR Van Tassel, CM Cox, and LR DeHaan. 2006. Prospects for developing perennial grain crops. BioScience 56:649-659.
Where perennials would be really useful would be in the terrible laterite soils of Africa. Perennial roots could reach the deep subsoil to get phosphate, potassium and the trace minerals that have leached from the laterite.
Caliche?
Seriously, one of the good ideas of “organic” gardening is the use of deep-rooted plants to retrieve deeply-buried nutrients for surface application (after either composting or animal “processing”).
As Jafar said: “The idea has merit.”
This could also work in places like Iraq where salinity of the subsoil is an issue. Prima-donna crops can’t get to the nutrients, but maybe a salt-tolerant, nonaccumulating deep-rooted (perennial) planting could be used to retrieve them….
What do you get when you have “deep-rooted plants to retrieve deeply-buried nutrients”? You “mine” the soil at a depth where you can’t replenish what has been taken. Not without some fancy doo-dads like injection wells, maybe — but I doubt that would return on investment. And there would be knee-jerk complaints about “poisoning the aquifer.”
I still like the idea, but none of these things is easy or simple.
3 meters is not aquifer depth; you can reintroduce nutrients to that depth. However, in a healthy system there is less need to do so.
There have been suggestions to use perennial nitrogen fixing woody plants to “mine” the deep subsoil, the problem is that most of the biomass they produce is not edible. I have had the idea that if you could convert that woody biomass into edible biomass, that would be a great thing too. I have thought of mushroom culture, or worm culture. Earthworms have to be pretty good at converting plant biomass into worm biomass. They should be more efficient than ruminants because they are cold blooded.
All the trace minerals would be cycled through the worm biomass and be recovered when the worms are eaten by chickens.
Trace minerals in the deep soil aren’t doing anything except sitting there. Bringing them to the surface and incorporating them into food means that people can benefit from them. So long as they are not allowed to leach away, they are not lost.
daedalus2u:
Just some thoughts that came-up:
Conservation of Mass.
I have repeatedly received suggestions that, for my home greenhouse’s fertilizer needs, I just ‘use the water from the aquariums: that’s full of nutrients.’ Leaving aside for now nitrification/denitrification, where, exactly, do those ‘nutrients’ come from? ‘From The Fish.’ Where do THEY get them? By some transmutation of elements from non-nutrient to nutrient?
No, they come from the fish food.
Likewise I’m told that to increase the soil nutrients, all I have to do is let animals graze and use their manure (and bones and offal). Where do the nutrients in the manure come from? From the plants the animals eat, plus the feed I bring-in.
In each case, I could skip a step: simply directly apply the fish food or compost the plants where they are and apply the feed as a fertilizer.
‘So there’s a larva in your fruit: don’t worry: it’s protein.’ I cringe at that. Unless the insect can fix nitrogen, the only nutrients it brings to the fruit are in the original egg.
Strictly-speaking, the above only apply to mineral nutrients: nitrogen is a bit more involved. Animal (me) nutrition is complicated by issues like amino acid balance and interconversion (fermentation [silage] can increase the nutritional quality of a feed by encouraging the interconversion of amino acids to a more salubrious balance). Likewise nitrogen fixation and return to the atmosphere is a complication.
The primary advantage of routing the nutrient source through animals is that the CALORIES (from proteins, carbohydrates and fats) can be used by heterotrophs (animals, including worms and fungi for the sake of conversation), but are immaterial for plants. Secondary benefits include acceleration of availability, grinding services, elimination of seeds and diseases, etc.
Goats offer a dandy grinding service to accelerate nutrient availability, and at least a fraction of the calories become tasty beg wat. Unfortunately, there is a bit of a problem repatriating the animal’s body nutrients (I eat some, and the bones and offal are problematic).
Worms and fungi aren’t as useful for grinding or delicious in stew.
See http://humanurehandbook.com/
A related issue that I have always wondered about is the energy (calorie) waste of composting: all the energy that could go to growing meat or running my car is wasted feeding bugs and making steam in a compost pile – or worse, in a big commercial composting facility like in many US cities, where it is a Big Problem. If the compostables were diverted through a methane-generation system, wouldn’t some of that energy be recovered? Would the sterilization (provided by the waste heat in the normal method) still be accomplished? All the minerals are still there, obviously, but how much denitrification does the anaerobic “digestion” induce? Can that be limited? The residue of methane digestion should still be a useful source of (mineral) nutrients, as is ash, but also of lignaceous soil-texturizers.
Of course, burning also retains mineral nutrients, and removes all calories, but leaves no N or texture.
Pyrolysis is intermediate between composting/digesting and burning, and might be advantageous in certain environmental conditions where low effective humic content of soil is a limiting factor on productivity, but it doesn’t look like it scales well: I cringe to see people carrying buckets of soil amendments: http://www.biochar.org/joomla/
Ehhh, bag it! I don’t have time right now to delve so deeply into the energy and nutrient economics of agriculture at different scales, different capitalizations and different environmental boundary conditions. It sure looks like it would be amenable to a computational model though: does anybody know of one?
In your fish food example you are exactly right. Cycling the fish food through fish does not add any nutrients. Nitrate is a little different than ammonia; a combination of both works best as a nitrogen source for plants (usually), but that can cause denitrification.
A system that is fully anaerobic won’t do denitrification. Ammonia first has to be nitrified before it can be denitrified. Oxidizing ammonia to nitrite takes less O2 partial pressure than oxidizing nitrite to nitrate.
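For reference, the pathway being discussed – standard textbook stoichiometry added here for clarity, not taken from the comment above: nitrification is two aerobic oxidation steps, and denitrification then reduces nitrate stepwise once oxygen is gone:

$$\mathrm{NH_4^+ + \tfrac{3}{2}\,O_2 \rightarrow NO_2^- + 2\,H^+ + H_2O}$$
$$\mathrm{NO_2^- + \tfrac{1}{2}\,O_2 \rightarrow NO_3^-}$$
$$\mathrm{NO_3^- \rightarrow NO_2^- \rightarrow NO \rightarrow N_2O \rightarrow N_2 \quad (\text{anoxic})}$$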
Composting biomass is a waste. The bacteria turn it into CO2. Making biochar is a much better thing to do with biomass than turn it into compost. There are some projects that make stoves for use in the undeveloped world, which gasify a portion of the waste biomass, the gas is burned to make heat for cooking and biochar is produced for carbon sequestration. The carbon sequestration credits can pay for the whole process. People in the undeveloped world get access to free fuel, they sequester carbon, and the stoves greatly reduce air pollution (smoke, soot and CO) and heat better.
http://www.vrac.iastate.edu/ethos/proceedings2011/Mulcahy_CarbonOffsets.pdf
In tropical laterite soils, humus doesn’t persist because of the oxidizing conditions. Biochar does persist and greatly improves the fertility of the soil. Mineral soil doesn’t have any anion exchange sites. It has cation exchange sites which is why ammonia can persist and doesn’t leach but nitrate does leach. The carbon in biochar does have anion adsorption sites as well as surface area for bacteria to live on.
http://www.css.cornell.edu/faculty/lehmann/research/terra%20preta/terrapretamain.html
If organic farmers turned their waste biomass into biochar, they would improve their soil more than using it to make compost. There is loss of nitrogen in making biochar (to some extent, not necessarily completely). There is loss of nitrogen in composting too.
daedalus2u:
We are much in agreement here, but I would caveat a couple of things (not having yet read all the lit): How do you keep people from burning the char(coal) for cooking? Charcoal is, after all, a traditional fuel, as evidenced by the massive deforestation caused by the demand for it. Also, if ion-exchange capacity and soil texture are not the limiting factor in a given circumstance, it’s not (as) effective (though I’d like to know the cation release rate re Ca, K, Mg, etc. that were present in the original source). If a lot of it is used, does it present a fire hazard like this: http://www.offroaders.com/album/centralia/centralia.htm? I’m thinking that it does not necessarily apply as well to arid-hot or not-hot as to wet-hot climates.
If carbohydrates and the like were removed from biomass in some non-wasteful manner, the remaining ash and lignins/humics would still be a good soil amendment in many places.
Hot composting and anaerobic CH4 digestion are not compatible with prior or simultaneous vermicomposting, but maybe subsequent worm treatment could extract the N in a way to protect it from loss. If the overall conversion efficiency from N-content in raw material to chicken (or whatever) were sufficiently high, it would probably be worth it. But N is relatively cheap and abundant from legumes or Herr Haber. K and P are more problematic.
Behavioural/social engineering may be of value: recycling ash, “Manural Rights” and the like. It’s the Tragedy of the Commons when trees, dung and plant roots have such great short-term value that they are destroyed rather than serving higher long-term purposes.
You keep people from burning charcoal by providing them with a superior alternative that is cheaper. That is what the stove projects are doing. Traditional charcoal production is very wasteful. It only works on large woody biomass and wastes most of the fuel value.
The process of converting waste biomass into a fuel that can be burned to generate heat and biochar can be paid for by carbon sequestration credits of the biochar produced.
To propagate a fire underground you need very high concentrations of combustible material, and you need a high degree of insulation so the combustion zone can remain hot enough. That won’t be a concern for a very long time, if ever.
Converting waste cellulose and carbohydrates into edible biomass is more “efficient” at food production than doing anything else with them.
Hmmm,
I have a superior alternative, but I still sometimes use charcoal to cook.
Commercial charcoal “briquettes” are about the same stuff as “Char”: they are made from miscellaneous vegetation scraps, not necessarily all woody, and other things like lignite, with lime added to give the special ash they burn into. There are other additives too.
Soil fires are no joke. Where I grew up, great swamps are maintained, prevented from filling-in through eutrophication, by fires in drought years that burn the peat out of the soil down to the water table. The amount of fuel doesn’t have to be very much: the mine dumps from coal mining can burn, and presumably, the miners didn’t choose to waste valuable coal. And then there are these:
https://www.dmr.nd.gov/ndgs/ndnotes/ndn13_h.htm
http://www.upi.com/Odd_News/2008/07/18/Potting_soil_fire_destroys_home/UPI-73621216422101/
http://punkrockgardens.com/2010/11/warm-up-to-biochar-for-better-soil/
The coal in ND is lignite. That can easily spontaneously ignite and is present in very thick seams (tens of meters). Biomass is easier to spontaneously ignite than is biochar. Oily rags are the classic fire hazard due to spontaneous combustion. I have no idea what that potting soil fire is about.
Flammability issues of soil are only going to occur at gigantic biochar loadings, many tens of percent. Ten percent biochar in the top 0.5 meter of soil would be ~100 kg/m2. That is 100,000 tons per km2. The trillion tons of carbon limit that humans are half way toward emitting would only take up 10,000,000 km2. That is 1/3 the area of Africa.
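For what it’s worth, the arithmetic above checks out. A quick sketch (the ~30 million km² round figure for Africa’s area is my own assumption; the other numbers are the ones in the comment):

```python
# Sanity check of the biochar loading figures in the comment above.
loading_kg_m2 = 100                         # "~100 kg/m2" at 10% biochar, 0.5 m depth
loading_t_km2 = loading_kg_m2 * 1e6 / 1e3   # 1 km^2 = 1e6 m^2, 1 tonne = 1e3 kg

carbon_t = 1e12                             # the "trillion tons of carbon" limit
area_km2 = carbon_t / loading_t_km2         # area needed to hold it at that loading

africa_km2 = 30e6                           # assumed round value for Africa's area
print(loading_t_km2)          # 100000.0 t/km2
print(area_km2)               # 10000000.0 km2
print(area_km2 / africa_km2)  # ~0.33, i.e. about a third of Africa
```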
Biochar is not like charcoal briquettes. It is mostly fine particles and not big chunks. The heating value of the biomass is reduced by turning it into biochar. Any flammability problem of soil would be worse with incorporation of the biomass without turning it into biochar. Incorporating biomass into soil makes the soil hypoxic because soil bacteria use O2 to oxidize the biomass. Biochar is not oxidized by soil bacteria but its porosity increases soil O2 levels.
One of the major uses for heating in the undeveloped world is boiling water to reduce disease transmission. The stoves that use biomass to make biochar boil water a lot better than charcoal burning stoves do.
OK: I am aware neither of the necessary fuel content to support combustion underground/in soil (though it’s surprisingly low), nor of how that compares to the concentrations achieved in “biochar” application. I still think it’s worth being aware of; though a field burning-up in the dry season itself would be a limited inconvenience, near- or on-site buildings and the like would be a problem.
The potting-soil thing was probably a consequence of a discarded cigarette, but it establishes a datapoint of flammability: most commercial potting soils are ~50% organics, and we could certainly characterize them in the lab (what an exciting project!), but it would be more direct to work on char.
I know one manufacturer uses “slash” from logging operations (sawdust, wood-chips, bark, twigs and needles (pine)) to make low-Q gas for boilers and the “charcoal” residue from the process is pressed into “briquettes.” Briquettes break-down into a fine-grained “mud” of charcoal when wetted. (Incidentally, I believe borax is one of the additives used in briquettes, along with lime, etc.: it would be inappropriate to blithely add them to soil because of possible Boron phytotox, just like coal ash.)
I still question the economics of char generation: Let’s say neighbor A has a superdeduper new bio-char stove, and is getting-along dandily, saving-up a nice pile of char to add to the field. Meanwhile, neighbor B does not have a bio-char stove, but continues to buy charcoal that is expensive because it comes from far away where there is still some vegetation, as do neighbors C through Z. They see neighbor A’s pile of charcoal; one reaches in his pocket, pulls-out money and says “Hey, A, I’ll give you this for your charcoal!” What does A do, and why?
Neighbors B through Z get stoves too, with free fuel if they bring the biochar back. The biochar these stoves produce is not nice big chunks of charcoal; it is like the fine-grained mud that charcoal briquettes break up into, except it is already broken up.
If neighbor A sells her biochar, then she would have to go and buy fuel too. If she turns her biochar back in, she gets free fuel to replace it with.
Burning charcoal powder (what the stoves produce) isn’t going to be easy, or fun, or satisfactory.
I think borax is too expensive to use, I think they use clay and starch. Borax would also add boron to any food cooked over it which would be unacceptable.
http://www.virtualweberbullet.com/charcoal.html