Archive for the ‘computer models’ Category

Earth: not so dim after all

Sunday, May 8th, 2005

The NYTimes reports on three new papers in Science that call attention to a major gap in our understanding of how the climate system works. The papers report that the Earth is brightening; that is, more sunlight is reaching the Earth’s surface rather than being reflected by clouds, volcanic dust, or pollution. Scientists don’t know exactly what is causing the brightening, or how it is affecting the rest of the climate.

The findings of Dr. Wild and his colleagues are based on data through 2001 from a network of ground-based sensors that directly measure the sunlight hitting the ground. But the sensors are not evenly distributed, with the greatest number in Europe, few in Africa and South America, and none covering the 70 percent of Earth’s surface that is water.

Dr. Pinker’s team analyzed satellite data from 1983 to 2001 that covered the globe. Its findings about brightening, which basically agree with Dr. Wild’s, rely on computer models to estimate how much sunlight reaches the surface.

Finally, a team led by Dr. Bruce A. Wielicki of NASA’s Langley Research Center in Virginia reports that measurements from the agency’s Aqua satellite show a slight decrease in the amount of light reflected off Earth since 2000, which corresponds to a brightening on the surface.

So whom does one believe? The models, the sensors, or the satellite images? And why isn’t the Earth dimming as the models predicted? The public will likely view these conflicting data as simply more evidence that climate change is a hoax.

Christian first-person shooter

Sunday, May 1st, 2005

NYTimes writes on the drive to create a Christian gaming market.

The Rev. Ralph Bagley is on a very 21st-century sort of mission: introducing the word of God into what he calls the “dark Satanic arena” of the video-game business. But he has an old-fashioned calling to back it up. “I’ve always just loved video games,” he says. “I was one of the guys playing Pong. When I became a Christian in 1992, I still wanted to play, but it was hard when the best-quality games out there were Doom, Quake — Satanic stuff, you know? Stuff that if I went to church on Sunday and came home and wanted to play a video game, I kind of felt a little bit guilty about it. I tried to find other games out there that were Christian, and there were none. Absolutely nothing. I’m the kind of guy that when I see something that’s not being done, I want to do it myself.”

Can Tim LaHaye’s Left Behind series become the next successful console game?

Earth out of balance

Thursday, April 28th, 2005

From The Earth Institute at Columbia University:

Using satellites, data from buoys and computer models to study the Earth’s oceans, scientists have concluded that more energy is being absorbed from the Sun than is emitted back to space, throwing the Earth’s energy “out of balance” and warming the planet.

Scientists from the National Aeronautics and Space Administration (NASA) (Washington, D.C.), The Earth Institute at Columbia University (New York), and Lawrence Berkeley National Laboratory (California) have confirmed the energy imbalance by precisely measuring changes in ocean heat content over the past decade.

The study, which appears in this week’s Science Express, a feature of Science magazine, reveals that Earth’s current energy imbalance is large by standards of Earth’s history. The current imbalance is 0.85 watts per meter squared (W/m2) and will cause an additional warming of 0.6 degrees Celsius (1 degree Fahrenheit) by the end of this century. This is equal to a 1-watt light bulb shining over an area of one square meter or 10.76 square feet. Although seemingly small, this amount of heat affecting the entire world would make a significant impact. To put this number in perspective, an imbalance of 1 W/m2 maintained for the last 10,000 years is enough to melt ice equivalent to 1 kilometer (6/10ths of a mile) of sea level.
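The “1 kilometer of sea level” figure sounds startling, so here is a rough back-of-envelope check. This is my own sketch, not the study’s: it assumes all of the excess energy goes into melting ice at the freezing point, using textbook constants.

    # Rough check: how much ice would a 1 W/m^2 imbalance melt over 10,000 years?
    SECONDS_PER_YEAR = 3.15e7        # about 365 days
    LATENT_HEAT_FUSION = 3.34e5      # J/kg needed to melt ice at 0 C
    WATER_DENSITY = 1000.0           # kg/m^3

    energy_per_m2 = 1.0 * 10_000 * SECONDS_PER_YEAR        # ~3.2e11 J per square meter
    melted_kg_per_m2 = energy_per_m2 / LATENT_HEAT_FUSION  # ~9.4e5 kg per square meter
    water_depth_m = melted_kg_per_m2 / WATER_DENSITY       # depth of the melted water

    print(f"Roughly {water_depth_m:.0f} m of melted water per square meter")
    # Prints about 940 m -- the same order of magnitude as the 1 km quoted above.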

My First Footprint

Saturday, April 23rd, 2005

The Earth Day website has reminded me of the use of computer-mediated communication in conveying the impacts of our consumption on the planet. So take the ecological footprint quiz today from Redefining Progress. Does it persuade you to change your consumption patterns?

Google adds satellite maps

Tuesday, April 5th, 2005

The Associated Press reports on a new Google feature that incorporates satellite maps.

Online search engine leader Google has unveiled a new feature that will enable its users to zoom in on homes and businesses using satellite images, an advance that may raise privacy concerns as well as intensify the competitive pressures on its rivals.

The actual site is here. I can see why people would be alarmed. You can zoom in really close.

technology for cheap

Monday, April 4th, 2005

Check out the $100 laptops the MIT Media Lab is going to mass-produce for the developing world. Is this a good idea? Already in the first world we’re thinking about slapping on a technology tax to account for environmental “costs”, yet technology depreciates quickly over time. The laptops would help bridge the digital divide and make access more equitable, but they also mean more waste…and what about eroding the essence of cultures as the technology proliferates across nations, or is it a case of something lost, something gained? Will this also decrease our diversity? There is still something to be said about the amount of waste this will generate…the article doesn’t mention whether the technology is any less hazardous, and suppose first-world countries started offering laptops at $100 apiece? Consumption would increase, no doubt…

Online games run amok

Thursday, March 31st, 2005

A Guardian newspaper article that speaks to the sheer magnitude of the online gaming community in China and the degree to which it’s gotten out of control:

A spate of suicides, deaths by exhaustion and legal disputes about virtual possessions have been blamed on internet role-play games, which are estimated to have more than 40 million players in China.

The article highlights the story of one individual who is facing the death penalty for killing a (real) person who stole his (virtual) weapon.

Interoperability in sharing species data

Thursday, March 31st, 2005

The class is currently reading “The Green Internet”, a chapter in Conservation in the Internet Age, edited by James N. Levitt (2002, Island Press). It discusses the problem posed by the plethora of biodiversity data collected by museums and others that remains isolated in separate institutions:

After 300 years of species inventory, the biodiversity science community lacked the means–an information architecture and a set of common practices–for the discovery, retrieval, and integration of data. From one collection to the next–often within the same institution–underlying specimen data are heterogeneous and incompatible. The data are recorded and stored in thousands of idiosyncratic, independently developed information systems and are dispersed worldwide across academia, government agencies, conservation organizations, research institutions, and private museums (p. 146).

The solution is an architecture called Species Analyst that creates a standard for storing and sharing information, as well as an interface and tools for analyzing data. Species Analyst was developed by a consortium of biodiversity researchers and computer scientists at the University of Kansas’s Biodiversity Research Center and the Natural History Museum.

A report from the Cover Pages covers some of the technical details:

The Species Analyst relies heavily upon the fusion of the ANSI/NISO Z39.50 standard for information retrieval (ISO 23950) and XML. Z39.50 provides an excellent framework for distributed query and retrieval of information both within and across information domains. However, its use is restrictive because of the somewhat obscure nature of its implementation. All of the tools used by the Species Analyst transform Z39.50 result sets into an XML format that is convenient to process further, either for viewing or data extraction. This fusion of Z39.50 and XML brings standards based information retrieval to the desktop by extending the capabilities of existing tools that users are familiar with such as Microsoft’s Internet Explorer and Excel and ESRI’s ArcView.
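To make the “Z39.50 result set to XML” step a little more concrete, here is a minimal sketch in Python of the kind of mapping described above. The specimen record below is a made-up stand-in for what a Z39.50 query might return, and the element names only loosely imitate the Darwin Core style; the actual Species Analyst tooling and schemas will differ.

    # Minimal sketch: turn one retrieved specimen record into Darwin Core-style XML.
    # The record is hypothetical; a real result set would come back from a Z39.50 query.
    import xml.etree.ElementTree as ET

    record = {
        "InstitutionCode": "KU",                # hypothetical museum code
        "ScientificName": "Percina tanasi",     # snail darter, for illustration
        "Latitude": "35.60",
        "Longitude": "-84.55",
        "CollectionDate": "1976-08-12",
    }

    root = ET.Element("SpecimenRecord")
    for field, value in record.items():
        ET.SubElement(root, field).text = value

    # Serialized XML like this is what familiar desktop tools (Excel, ArcView,
    # a web browser) can consume, which is the point of the fusion described above.
    print(ET.tostring(root, encoding="unicode"))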

The Inter-American Biodiversity Information Network (IABIN) has a slide show that demonstrates the structure and features of Species Analyst.

What’s interesting about the chapter is not its report on the technical challenges of broad system diffusion, which are considerable, but its discussion of the social barriers to interoperability. First, the article points out that “too many museums have not grasped the first principle of the information age–namely, that access to their authoritative biotic information for knowledge creation and decision making is as valuable as the information itself” (p. 155). What the authors do not acknowledge is that transforming data into a format compatible with the information age (e.g., using the Darwin Core standard) takes a lot of time and resources. Who in academia and elsewhere has the time to adapt their datasets to a particular standard, and what’s in it for them? This is not a cynical review of university practices but a pragmatic reflection on the paradigm in which academics operate. The focus is on doing what’s necessary to get published and thereby advance in one’s career. Fail to comply with the paradigm and you get fired or aren’t promoted. This paradigm fails to recognize the prosaic need of academia to broadly diffuse its source data after the articles are published. Not recognizing that need means not giving out grants to do it or acknowledging the effort when promotion time comes around.

Second, the authors indicate that many institutions have policies that discourage and even prohibit the sharing of biodiversity data. The authors don’t mention that many of these policies protect the intellectual property of individuals as well as the intellectual capital of the institutions. Institutions may also be bound by liability concerns over potential misuse of the data or by copyright laws over which they have no control. For example, Canada operates under Crown copyright (i.e., the products of government activity must financially benefit the Crown), which makes the sharing of spatial data by government agencies nearly impossible.

Third, the authors caution that the successful integration and publication of all of the species collections may convince decision makers in institutions and government that sufficient data have already been collected to analyze biodiversity, and that therefore no further funds are necessary. This is, of course, the irony of developing a system such as Species Analyst, whose raison d’être is the idea that if only we could integrate all the species information out there, we could conduct phenomenal analyses of the world’s biodiversity. Why collect any more data, or why not wait until the analyses are done before collecting more? Promoting the system for broad diffusion inevitably undercuts the need for further basic data collection. This speaks to the low regard in which basic research is held, on both the left (“Who needs basic research on an insignificant species such as the snail darter when there’s so much poverty in the world?”) and the right (“Basic research on an insignificant species such as the snail darter impedes economic development, which is more important to the well-being of individuals”). It also speaks to the myth that technology can automatically create knowledge out of data.

Last, the authors mention that a lot of data is still not associated with technology or with geography. For example, what do we do about the legions of archival data that sit in museums? Who is paid to adapt collections data that can stretch back to the 1800s? Also, much of the data has no locational information (location is the prime method used to integrate data in Species Analyst) or only vague spatial data (e.g., a species may be recorded along a river reach rather than at a specific point). I’ve discovered instances in which the geographic data collected by biologists is irrelevant to their studies: the lat/long point at which data is collected really represents an entire region (even though the actual point has been GPSed), or represents an idealized landscape in which species are modelled. Datasets that contain abundant temporal and species diversity may be represented by a single data point.

I don’t want to detract from the research achievement of Species Analyst. Many people propose architectures to increase interoperability for biodiversity data, but few engage with the technical difficulties of actual implementation. Still, interoperability can be limited more by social hurdles than by technical obstacles.

Climate Change arguments from the Left

Tuesday, March 22nd, 2005

Do some leftist social science critiques so damage the planet that we shouldn’t engage in them?

Here is an example from a 2001 issue of the Annals of the Association of American Geographers, in which David Demeritt argues that we should fully explore the social construction of climate change models. Social constructionists argue that politics, culture, and power influence the building and application of such things as general circulation models (GCMs). Power can derive from politicians eager to exploit uncertainty over the science as cover to advance corporate agendas, or from academics eager to promote their discipline as the sole source of prediction on climate.

The author does offer mea culpas about the right’s use of leftist critiques as a way to undercut the entire climate change project. However, I think he vastly understates the ammunition that his own efforts lend to opponents of climate change.

Organize your brain just as easily as you organize your computer

Monday, March 21st, 2005

This NYTimes article describes software that will help us organize our thoughts and ideas. After all, are our brains not just a set of file folders or a road map? (I wish they were, but then I’m reminded of the state of my office and my inability to read maps.)

The article also explains the origin of this collection of software:

Both programs grew from the “Mind Mapping” movement, which is more famous in Britain and other parts of Europe than in North America, and whose origins are usually attributed to Tony Buzan. Beginning in Britain in the 1960’s, Mr. Buzan popularized the idea that to learn new topics, organize thoughts and become creative, people should draw “mind maps” on big sheets of paper, ideally with crayons or pens of many different colors. Mr. Buzan’s theories, including his 10 strict “laws” for drawing such maps, are available in his many books and seminars and at his Web site.

Who would have thought that crayons would improve university lectures? 😉

future industry

Tuesday, March 1st, 2005

I’m going to try to write a few more blog posts this week to compensate for the ones I missed before the break. It has been so long I even forgot how to log in! Someone mentioned something about 3D printers in class today, and it reminded me of a Slashdot post I read earlier about a printer that makes sushi. Is there anything technology won’t be able to do in the future? Apparently the ink is food-based and the paper is made of soy and seaweed. Can you see 3D printers being as common as personal computers in the future? Would they allow for the customization of goods? Think of it – you wouldn’t have to step into a mall again – you could make custom Barbie dolls as a gift for your little cousins while sitting at home!

The End of The Universe

Monday, February 28th, 2005

Now, the end of the universe is one of those things we could probably avoid thinking about without too many consequences. Still, it is sometimes interesting to consider what might happen a few billion years from now. Will the universe start to contract and eventually squish itself back into a little point before exploding again in another big bang? Or will it just keep expanding forever, until the suns go out and all is cold?

In either of those situations, you have to wonder what will happen to humanity, if we happen to still be around, or to some other intelligent life, if any exists and makes it that far. Is that simply the end of all life? It’s somewhat disconcerting to think so.

A fellow named Frank Tipler believes it is inevitable that the universe will fill up with intelligent life, and that by the time the universe collapses, life’s ability to process information will be asymptotically infinite (the closer we get to the collapse, the closer life comes to processing an infinite amount of information). That would allow life to simulate the entire universe over again, reproducing all of us and our lives within what is essentially a simulation of the universe.

It’s an interesting idea, although I’m certainly skeptical that he bases his theological statements entirely on the Bible. It’s interesting to wonder what the world will be like in a few thousand years, let alone a few million or billion.

You can read a little bit about his theory here (Tipler’s site) and here (Wikipedia).

Along the same lines, there’s a fun short story by Isaac Asimov that has another ending to the universe:
The Last Question

Artificial intelligence and the environment

Sunday, February 20th, 2005

The NYTimes today has an article on the future of war fought with artificial intelligence, “a 21st-century fighting force of automated tanks, helicopters and planes, remote missile launchers and even troops of robot soldiers – all coordinated by a self-configuring network of satellites, sensors and supercomputers.”

This is little different from the myriad computer models and satellite photos with which we make life-and-death decisions about the environment. Protect this habitat here. Put that dam in there. The NYTimes article reminds us of the conundrum inherent in leaving everything to the computer:

The whole point of automation is to rise above human fallibility – knee-jerk decisions, misunderstood orders, cowardly retreats. Machines are faster, more focused, impermeable to propaganda and, at least for now, they don’t talk back.

As the thinking machinery continues to evolve, the strategists will keep asking themselves the same question: Is there still a good reason to trust ourselves or should we defer to a computer’s calculations?

In other words, is it better to model the environment or better to walk in the woods? Should we base judgements on our spiritual connections to nature, or let the computer determine the optimal solution? If nature is socially constructed, that is, an individual’s mental vision built up from past experiences, biases, and culture, then whom or what should we believe? Would a computer be any less biased?

Reflecting on Java

Thursday, February 17th, 2005

J-S’s proposal to research the social and environmental impacts of the Java architecture prompted me to see if there was anything on the social construction of Java, which led me to this. Enjoy.

Science, computer models and politics

Monday, February 14th, 2005

In politics, all it takes is a little tweak of the computer model, perhaps a change in the units of analysis, and you get the result the politicians want.

ECOLOG-L@LISTSERV.UMD.EDU
From: Ecological Society of America: grants, jobs, news
Subject: Survey: political intervention in science pervasive at USFWS

Hello everyone,
The Union of Concerned Scientists (UCS) held a press conference to announce the disturbing results of a survey of U.S. Fish and Wildlife Service field scientists: political intervention to alter scientific findings has become pervasive within the agency. At field offices around the country, USFWS scientists tell of being asked to change scientific information, remove scientific facts or come to conclusions that are not supported by the science. As a result, the scientists say, endangered and threatened wildlife are not being protected as intended by the Endangered Species Act.

Despite agency directives to scientists not to reply to the survey, even on their own time, nearly 30% of the scientists responded. You can find a summary of the survey, its methodology, and a summary of results broken down by region here or by clicking here.

RESULTS SUMMARY
The survey paints a vivid picture of the systemic abuse of science and the need for change. Results show that:

Large numbers of agency scientists reported political interference in scientific determinations. Nearly half of all respondents whose work is related to endangered species (44%) report that they have been directed for nonscientific reasons to refrain from making findings that protect species. One in five have been instructed to compromise their scientific integrity, reporting that they have been “directed to inappropriately exclude or alter technical information from a USFWS scientific document.” In the Southwest region, that number was even higher, closer to one in three.

Agency scientists reported being afraid to speak frankly about issues and felt constrained in their role as scientists. 42% said they could not publicly express “concerns about the biological needs of species and habitats without fear of retaliation,” while 30% were afraid to do so even within the agency. A third felt they are not allowed to do their jobs as scientists.

There has been a significant strain on staff morale. Half of all scientists reported that morale is poor to extremely poor; only 12% believed morale to be good or excellent. And 64% did not feel the agency is moving in the right direction.

Political intrusion has undermined the USFWS’s ability to fulfill its mission of protecting wildlife from extinction. Three out of four staff scientists felt that the USFWS is not “acting effectively to maintain or enhance species and their habitats.”

In one of numerous essays submitted on the topic of improving scientific integrity at USFWS, one biologist wrote: “We are not allowed to be honest and forthright…I have 20 years of federal service in this and this is the worst it has ever been.” Another scientist reported that Department of Interior officials “have forced upper-level managers to say things that are incorrect.” A manager wrote: “There is a culture of fear of retaliation in mid-management.”

Encouragingly, it is clear from the survey that USFWS scientists are committed to and proud of their work and believe in the potential of the agency to conserve, protect and enhance fish, wildlife and plants and their habitats. However, political intervention is having a chilling effect on the ability of USFWS scientists to carry out the agency’s mission.

UCS has joined with Public Employees for Environmental Responsibility (PEER) to design and conduct surveys of several government agencies to document the abuse of science and determine the pervasiveness of the problem. The surveys will assist the scientific community in documenting that the abuse of science is an ongoing, serious concern. We are looking into ways that the results of the USFWS survey can be used to further a more thorough investigation of this problem.

It has taken decades to build a world-class scientific staff at the USFWS and other government science agencies. The future ability of the agency to fulfill its mission will be severely hampered if this political interference is allowed to continue. To restore scientific integrity at the USFWS, at least two reforms are needed: there must be protections for scientists who are asked to take actions that violate their scientific integrity, and the Bush administration must recognize at its highest levels that manipulating or suppressing science for political reasons is unethical.

Signed: Michael Halpern, Outreach Coordinator, Restoring Scientific Integrity in Federal Policy Making Campaign, Union of Concerned Scientists; Dean A. Hendrickson, Ph.D., Curator of Ichthyology, University of Texas, Texas Memorial Museum, Texas Natural History Collections; and others.

See also last week’s LA Times, U.S. Scientists Say They Are Told to Alter Findings

Distributed computing project has bad news for climate change

Wednesday, February 9th, 2005

A recent report in Nature covers the first findings from the distributed computing project climateprediction.net. The report finds that global temperatures could rise by up to 11°C, roughly twice the upper estimate of previous studies.

The project is remarkable because it uses distributed computing, which spreads the processing and analysis across multiple CPUs (the chips in your computer that do the number crunching). Often these chips sit in entirely separate computers. What’s even cooler is that one of those computers can be your own home PC: instead of relying on a single incredibly fast multiprocessor machine or a supercomputer, the project lets you download software to your PC, which runs the model while your computer is idle. The application typically shows up as a nice screensaver.
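To give a flavour of the mechanics (on one machine rather than on thousands of volunteered PCs), here is a toy sketch using Python’s multiprocessing module. The “simulation” function is a made-up stand-in for a real climate model run; climateprediction.net’s actual software is far more elaborate.

    # Toy sketch of the distributed computing idea: farm out many independent
    # "simulation" runs across CPUs. In a volunteer project, each work unit
    # would go to someone's home PC instead of a local worker process.
    from multiprocessing import Pool

    def run_simulation(params):
        """Stand-in for a climate model run with slightly different parameters."""
        sensitivity, co2_multiplier = params
        return sensitivity * co2_multiplier   # fake 'warming' result in degrees C

    if __name__ == "__main__":
        # Each work unit gets slightly different parameters, as in the project.
        work_units = [(s / 10.0, 2.0) for s in range(1, 21)]
        with Pool() as pool:                  # one worker per CPU core by default
            results = pool.map(run_simulation, work_units)
        print(f"{len(results)} runs complete; warmest run: {max(results):.1f} C")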

Distributed computing was popularized by the SETI@home project (SETI stands for the Search for Extraterrestrial Intelligence), in which individual PCs sift through packets of radio-telescope data looking for evidence of extraterrestrial communication. The approach has since moved on to analyzing data for pulsars and studying protein-related diseases.

An excellent synopsis of the climate change project can be found at the BBC. They report that “More than 95,000 people have registered [to download the software], from more than 150 countries; their PCs have between them run more than 60,000 simulations of future climate. Each PC runs a slightly different computer simulation examining what happens to the global climate if levels of carbon dioxide in the atmosphere double from pre-industrial levels – which may happen by the middle of the century. ”

Of course, the accuracy of any model depends as much on its underlying assumptions as on its computational power, so it will be a while before we can assess the dramatic temperature rise the project predicts. Still, distributed computing holds great promise for climate change research, especially as the models grow in complexity and push against the limits of what can be feasibly and affordably done on conventional computing resources.

(An irreverent note. To date, distributed computing has been devoted to very serious projects. However, I’m waiting for the first silly project. How about a Shakespeare project that simulates millions of monkeys? I’d download the software.)