ON SATURDAY MORNING, the white stone buildings on UC Berkeley’s campus radiated with unfiltered sunshine. The sky was blue, the campanile was chiming. But instead of enjoying the beautiful day, 200 adults had willingly sardined themselves into a fluorescent-lit room in the bowels of Doe Library to rescue federal climate data.
Like similar groups across the country—in more than 20 cities—they believe that the Trump administration might want to disappear this data down a memory hole. So these hackers, scientists, and students are collecting it and saving it on servers outside the government's reach.
But now they’re going even further. Groups like DataRefuge and the Environmental Data and Governance Initiative, which organized the Berkeley hackathon to collect data from NASA’s earth sciences programs and the Department of Energy, are doing more than archiving. Diehard coders are building robust systems to monitor ongoing changes to government websites. And they’re keeping track of what’s already been removed—because yes, the pruning has already begun.
Later that afternoon, two dozen or so of the most advanced software builders gathered around whiteboards, sketching out the tools they’ll need. They worked out filters to separate mundane updates from major shake-ups, and explored blockchain-like systems to build auditable ledgers of alterations. Basically, it’s an issue of what engineers call version control: How do you know if something has changed? How do you know if you have the latest version? How do you keep track of the old stuff?
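The core of that version-control problem can be sketched in a few lines: hash each snapshot of a page's content, compare digests across crawls, and append every change to a log that is never overwritten. This is an illustrative sketch, not code from DataRefuge or EDGI; the names (`fingerprint`, `ChangeLedger`) and the example URL are hypothetical.

```python
import hashlib
from datetime import datetime, timezone

def fingerprint(content: bytes) -> str:
    """Content-addressed hash: identical pages yield identical digests."""
    return hashlib.sha256(content).hexdigest()

class ChangeLedger:
    """Append-only log of (timestamp, url, digest) entries, so every
    alteration to a monitored page stays auditable after the fact."""

    def __init__(self):
        self.entries = []   # full history, never overwritten
        self.latest = {}    # url -> most recent digest

    def record(self, url: str, content: bytes) -> bool:
        """Record a crawl; return True if the page changed since last seen."""
        digest = fingerprint(content)
        changed = self.latest.get(url) != digest
        if changed:
            self.entries.append((datetime.now(timezone.utc), url, digest))
            self.latest[url] = digest
        return changed

ledger = ChangeLedger()
ledger.record("https://example.gov/climate", b"CO2: 410 ppm")  # first crawl: changed
ledger.record("https://example.gov/climate", b"CO2: 410 ppm")  # identical: no change
ledger.record("https://example.gov/climate", b"page removed")  # edited: changed
```

Because old entries are never deleted, a removed or rewritten page leaves a timestamped trail—the same auditability property the whiteboard discussions of blockchain-like ledgers were after, achieved here with a plain append-only list.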