Description

The Industrial Revolution pushed civilization forward dramatically. The technological innovations it produced allowed us to build bigger cities, grow richer and construct a standard of living never before seen and hardly imagined. Subsequent political agendas and technological innovations have pushed civilization up above Nature, resulting in a disconnect. The environmental consequences, though, are leaving the Earth moribund. In this blog, I'm exploring the idea that integrating computational technology into environmental systems will be the answer to the aftermath of industry.

The drawing above is by Phung Hieu Minh Van, a student at the Architectural Association.

Tuesday, 15 October 2013

The eyes and ears of corn




This is a plot of tropical storms: every tropical storm that occurred in the Northern Hemisphere between 2008 and 2012. On the far left is continental Europe (you can easily make out Italy), in the middle is the Pacific Ocean and on the far right you can make out Greenland, Iceland and the UK. Instead of plotting the storms as tracks, as is normally done, I've plotted them as points to show the density of storm activity. For some storms it's possible to make out the tracks easily, as the points line up in clear paths where there aren't many other points, for example over Canada. The different colours show the basins in which the storms started out; for example, red represents all the storms with their genesis in the North Atlantic. It's simple enough to understand this picture, but the amount of organization, observation and data processing required to get this data is quite amazing. This post is going to be about human monitoring of the environment and our attempts to model its processes.
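For anyone curious how a figure like this might be put together, here is a minimal sketch. It assumes a hypothetical CSV of 6-hourly storm positions (columns: storm_id, lat, lon, basin), for example an extract from a track archive such as IBTrACS; the file name, basin codes and colour choices below are placeholders, not the ones actually used for the plot above.

```python
# A minimal sketch of a storm-density plot: every 6-hourly storm position is
# drawn as a point, coloured by the basin the storm started in.
import pandas as pd
import matplotlib.pyplot as plt

fixes = pd.read_csv("storm_fixes_2008_2012.csv")  # hypothetical file name

# One colour per genesis basin (e.g. "NA" = North Atlantic).
colours = {"NA": "red", "WP": "blue", "EP": "green", "NI": "orange"}

fig, ax = plt.subplots(figsize=(12, 5))
for basin, group in fixes.groupby("basin"):
    # Plot positions as points rather than joining them into tracks,
    # so dense regions of storm activity stand out.
    ax.scatter(group["lon"], group["lat"], s=2,
               color=colours.get(basin, "grey"), label=basin)

ax.set_xlabel("Longitude")
ax.set_ylabel("Latitude")
ax.legend(title="Genesis basin", markerscale=4)
plt.savefig("storm_density.png", dpi=200)
```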

Keeping a record of things happening is very useful. This was recognised very early on in human history: some of the earliest forms of the written word were pictorial representations of animal stocks and plant locations. Unfortunately, for a long time technology did not afford us the opportunity to keep records without huge vulnerabilities. Early archives were kept by the ancient Chinese, Greek and Roman civilizations, but little of them survives today. Thanks to good data collection, archiving, the emergence of computational technology, hard drives and the application of statistics (in particular the methods of Thomas Bayes), we are now harnessing the huge power of stored information.

Monitoring the natural world has lots of advantages. Doing so allows us to plan for the future, deal with problems as they arise, tune performance, track expectations and study the environment. Collecting all this information in archives makes it even more powerful. The difficulty, though, is not so much the actual monitoring process as knowing what should be monitored. Planning, implementing and running monitoring systems is expensive, and collecting superfluous data results in unnecessary costs and unwieldy archives. So there is an important question of what to monitor and how often it needs to be recorded.

Physical environmental variables are fairly simple to pin down. We need to record temperatures, pressures, humidities and the like if we want to know how the weather works and interacts with nature and civilization. Other variables are not so easy to pick out. Generally, a research field or commercial application will require data on a variable, for example information on forest fires, and then collect it themselves or commission someone to collect it for them. Not the other way around.

It is also the case that what we can measure and what we want to measure are different. For example, satellites orbiting the world can measure all sorts of things, one of which is photosynthesis in plants. But nothing sent out by the process of photosynthesis is directly seen by the sensors on the satellite. So we develop methods that use the information the orbiting sensors do pick up (i.e. photons of electromagnetic radiation) to infer information that is useful for other purposes (e.g. photosynthesis rates for ecological research).
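As a toy illustration of this kind of inference (my own example, not one tied to a particular satellite mission): the Normalised Difference Vegetation Index (NDVI), a widely used proxy for photosynthetic activity, is derived from the red and near-infrared reflectances a sensor actually records. The numbers below are invented.

```python
# Infer a useful quantity (NDVI, a proxy for photosynthetic activity) from
# what the sensor actually measures: reflected red and near-infrared light.
import numpy as np

red = np.array([[0.08, 0.10], [0.30, 0.25]])   # red-band reflectance
nir = np.array([[0.50, 0.45], [0.32, 0.28]])   # near-infrared reflectance

ndvi = (nir - red) / (nir + red)  # dense, healthy vegetation -> values near 1
print(ndvi)
```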

Broadly, environmental monitoring programs can be split into two categories: remotely sensed data and in-situ data collection.
  
Remote sensing is 'the science, technology and art of obtaining information about objects or phenomena from a distance (i.e. without being in physical contact with them)' (definition from Dr. Mat Disney). It is an incredibly useful method of observing the Earth (and elsewhere too). The technology of remote sensing has its roots over 150 years ago with the invention of the camera. The first person to take an aerial photograph was a Frenchman called Gaspard-Félix "Nadar" Tournachon, who went up in hot-air balloons tethered to buildings to take photos of houses. For a long time, information garnered from remotely sensed images was limited to qualitative information only. As digital sensors improved and satellites were launched, concrete, quantitative information could be derived from these images.

Modern remotely sensed images can give us information on meteorological, geophysical, biological, social, chemical and political variables that can be used for all sorts of things. For example, information collected by the American National Oceanic and Atmospheric Administration's (NOAA's) satellites is used in operational weather forecasts and scientific research globally. I've produced this animation of global maximum temperatures for every 6 hours in the year of my birth, 1992. The data is from NOAA's 20th Century Reanalysis.
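For the curious, here is a rough sketch of how one frame of such an animation might be drawn. It assumes a NetCDF extract of 6-hourly maximum temperature fields; the file name and variable names are placeholders rather than the actual 20th Century Reanalysis conventions.

```python
# Draw one frame of a global maximum-temperature map from a NetCDF file.
from netCDF4 import Dataset
import matplotlib.pyplot as plt

ds = Dataset("tmax_1992.nc")              # hypothetical reanalysis extract
lats = ds.variables["lat"][:]
lons = ds.variables["lon"][:]
tmax = ds.variables["tmax"][0, :, :]      # first 6-hourly time step

plt.figure(figsize=(10, 5))
plt.pcolormesh(lons, lats, tmax, cmap="inferno")
plt.colorbar(label="Maximum temperature (K)")
plt.title("Maximum temperature, first 6-hourly step of 1992")
plt.savefig("frame_0000.png", dpi=150)
# Repeating this for each time step and stitching the frames together
# (e.g. with ffmpeg) gives the animation.
```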





On the ground, however, we can build up a different picture. Electronic, automated data loggers can record information for a range of variables, from environmental ones such as temperature and pressure to social ones like road traffic and noise levels. These loggers can be integrated into wireless networks pretty simply, and the information recorded can be stored and transmitted around the world (Akyildiz et al. 2007). For example, there are networks of tethered buoys in the ocean that transmit information on swell heights. This information is used for all sorts of things, from tsunami monitoring to more trivial pursuits.
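To make the idea concrete, here is a minimal sketch of such an automated logger (entirely my own illustration: the sensor read, the endpoint URL and the sampling interval are all placeholders). It samples a sensor on a fixed schedule, timestamps each reading and sends it to a collection server.

```python
# A minimal automated data logger: sample, timestamp, transmit, repeat.
import json
import random
import time
import urllib.request

def read_sensor():
    # Placeholder: a real node would query a thermometer, pressure gauge,
    # wave buoy, traffic counter, etc.
    return {"temperature_c": 15.0 + random.uniform(-2, 2)}

def transmit(reading, url="http://example.org/ingest"):  # hypothetical endpoint
    payload = json.dumps(reading).encode("utf-8")
    req = urllib.request.Request(url, data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=10)

while True:
    reading = {"time": time.time(), **read_sensor()}
    try:
        transmit(reading)
    except OSError:
        pass  # a real deployment would store the reading locally and retry
    time.sleep(6 * 60 * 60)  # one reading every six hours
```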

Despite these monumental opportunities for data collection that technology affords us, constraints remain. Sometimes people actually need to write things down.

For example, automating the monitoring of fauna remains difficult. Individual animals can be fitted with trackers, but observing whole populations this way currently remains unfeasible. Keeping records of vertebrate populations is vital for all sorts of reasons, not least for agriculture and environmental management. However, population dynamics are understood to occur at large spatial and temporal scales, and collecting a large volume of data in as close to real time as possible is a difficult problem. Compromises will inevitably be made, but we would like to minimize them. Traditionally the approach has been to use a team of highly educated and skilled researchers to decide on a subset of the population thought to be representative of the population at large and to visit the area periodically for data collection. This is fieldwork. We can do better.

Luzar et al. (2011) discuss large-scale, long-term environmental monitoring by non-researchers. They describe a 3-year study of socioeconomic factors, hunting behaviour and wildlife populations in a 48,000 square-kilometre (about 2.5 times the size of Wales), predominantly indigenous region of Amazonia. They found that, despite high rates of illiteracy, innumeracy and unfamiliarity with the scientific method, provided the potential of the data was explained and it had a clear purpose for those involved, people were likely to engage long-term and collect high-quality data.

Expectations are changing. The reality is that engineers, scientists and managers (including politicians) expect statistically robust and up-to-date data to inform their decisions, and they are expecting more and more high-quality data. We are transitioning from a time when patchy, out-of-date information was put up with as a fact of life to one where it will not be. The designers and implementers of environmental monitoring schemes are going from a time when it was preferable to have high-quality data to one where it is a requirement.

So what's the point of all this? Well, today we live in an era when we can not only collect all sorts of information about the natural world and how we are interacting with it (and on it), we can also store all this information, query it and use it. This is really a unique time in human history. Around 150 years ago the first aerial photograph was taken, 100 years ago magnetic storage (the precursor to modern hard drives) was first properly developed, and in 2005 Google Earth went live, bringing people satellite imagery from around the world. By the end of 2014 it might even be possible that Google Earth is in near real time. Having this data should change the fundamental mechanisms on which society works. We can now continually test ideas, fix problems earlier and track our progress better. This data should be looked after properly. It is the lifeblood of the integrated, physical engineering structures that I will explore later in this blog.

What I'm really keen to get across in this post is that we now have a phenomenal ability to observe reality, one that has never before existed for civilization. Existing monitoring schemes need to be looked after and new ones built. The data collected should be accessible to all those that might need it, without being wrapped up in bureaucracy and politics. Society should also make sure it is able not only to use the data to draw effective conclusions but also to act on what the data is telling us.

All sorts of constraints exist though. Implementing data collection schemes is costly: the cheapest satellites belonging to NASA or ESA cost around 80 to 150 million pounds to construct, and employing people is expensive too. Data collection and monitoring is also wrapped up in questions of accessibility (should this data be private or public?) and ethics (is our experience of life on Earth diminished by knowing what is happening everywhere at all times?). I hope to think about some of these issues in another post.

In my next post I will be 'fleshing out' collected data. Even in the most thorough data collection schemes, gaps will remain or data will be needed at a higher resolution than it was originally collected at. For this, a mathematical process called data assimilation is used: it involves looking for trends and patterns in the data, then using these to infer what is happening where data wasn't collected.
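Data assimilation proper blends observations with a physical model, which is more than I can show here, but the basic idea of inferring values where nothing was measured can be illustrated with a much simpler stand-in: interpolating across a gap in a time series (the numbers below are invented).

```python
# Fill gaps in a time series by interpolating from the surrounding observations,
# a very crude stand-in for what data assimilation does with a full model.
import numpy as np

hours = np.arange(0, 24, 3)                      # observation times (hours)
temps = np.array([8.0, 7.5, np.nan, np.nan, 14.2, 15.0, 12.1, 9.8])

observed = ~np.isnan(temps)
filled = temps.copy()
# Estimate the two missing readings from the trend in the surrounding data.
filled[~observed] = np.interp(hours[~observed], hours[observed], temps[observed])
print(filled)
```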

**********
Please engage with my writing and leave comments/criticisms/suggestions etc below.

2 comments:

  1. This comment was just emailed to me by Dr. Mat Disney (who is mentioned in the post):

    Hi Josh - I like it, interesting post.

    I'd make the point that data collection and storage is incredibly important (and you may not know why or what for at the time) - maybe go back further in history and look at some of the records that were made that enabled us to answer big questions, e.g. Tycho Brahe's planetary observations that enabled Kepler to formulate his laws of planetary motion. Or Darwin's painstaking observations of finch characteristics. And more recently phenology observations made of plant flowering and fruiting dates (see Fitter and Fitter 2002 for eg http://www.sciencemag.org/content/296/5573/1689.short). Or Keeling's Mauna Loa observations of CO2 - one of the most important of all in the last 50 years. A key question which you allude to but could maybe follow is how on earth can we make eg satellite data we've collected over the last 30 years (say) robust and readable by humans in 100 years, never mind 1000? We can read things the Ancient Egyptians wrote; will we be able to read temperature records made digitally in 100 years?

    Cheers,

    Mat
