Digital Research Infrastructure for the Arts and Humanities

http://www.dariah.eu/

What is DARIAH? DARIAH (Digital Research Infrastructure for the Arts and Humanities) is a project to support the digitisation of arts and humanities data across Europe. (Strictly speaking, that should be "DRIAH." Maybe history was dry enough already.) DARIAH brings together researchers, information managers and ...

Simulations at the Petascale and Beyond for Fusion Energy Sciences

(Image: fusion donut)

Imagine harnessing the power of the sun within a magnetic bottle. Unlike hydrogen bombs, which are essentially uncontrolled fusion reactions, scientists have for decades been pursuing the peaceful challenge of safely harnessing fusion energy, a potentially efficient and environmentally attractive energy source. Progress in addressing this scientific grand challenge, suggested William Tang, the Director of the Fusion Simulation Program at the Princeton Plasma Physics Laboratory (PPPL), has benefited substantially from advances in supercomputing. At the March 10 Lunch 'n Learn, Tang noted that such capabilities continue to progress at a remarkable rate, from terascale to petascale today, and to exascale in the near future.

If we can create the conditions for fusion to occur, says Tang, by bringing deuterium and tritium together at very high temperatures, the reaction produces alpha particles, fast neutrons, and an energy multiplication of 450:1. It would then be possible to use that energy to heat the burning plasma in a self-sustaining reaction.


The Federal Government recognizes the importance of the effort, as is evident, for example, in the Department of Energy document "Facilities for the Future: A Twenty-Year Outlook." Current Presidential Science Advisor John Holdren has commented that it is important to shrink the time scale for achieving fusion energy deployment by increasing appropriate investments in fusion research and development.

Tang pointed out that major progress achieved over the years in magnetic fusion research has led to ITER, a multi-billion-dollar burning plasma experiment currently under construction in Cadarache, France. Seven governments (the EU, Japan, the US, China, Korea, Russia, and India), together representing over half of the world's population, are collaborating on this international effort, which is led by the EU. To date, laboratory experiments have produced 10 megawatts of fusion power for approximately 1 second. The goal for ITER is to produce 500 megawatts of fusion power for more than 400 seconds. A successful ITER experiment would demonstrate the scientific and technical feasibility of magnetic fusion energy.

Tang emphasized that the burning plasma experiment is a truly dramatic step forward in that the fusion fuel will be sustained at high temperature by the fusion reactions themselves. Worldwide experimental data and computational projections indicate that ITER can likely achieve its design performance. Indeed, notes Tang, temperatures in existing experiments have already exceeded what is needed for ITER.

Tang expressed the hope that American investments in fusion energy development will keep pace with those of other countries, and that it will be possible to deal effectively with the political and associated financial constraints so as to achieve the kind of sustained support that this highly challenging research will require. Such support will be essential for attracting, training, and assimilating the bright young people who are needed to move the program forward.


The ITER effort will clearly require strong research and development to harvest the scientific knowledge it produces, which Tang pointed out entails properly integrating advanced computation with experimental data acquisition and analysis, together with fundamental plasma theory. Progress will be significantly aided by the accelerated development of computational tools and techniques that support the scientific understanding needed to build predictive models, models that can prove superior to empirical extrapolations of experimental results. This is the key motivation for the Fusion Simulation Program (FSP), a new U.S. Department of Energy initiative supported by its Offices of Fusion Energy Science and Advanced Scientific Computing Research that is currently in the program definition and planning phase.

Tang expects that the FSP will make unique contributions to the fusion program by addressing the integration challenges of multi-scale physics problems that are currently treated mostly in isolation. The FSP approach will involve a rigorous and systematic validation program to build confidence in the associated predictive models and to improve our capability for reliable scenario modeling of ITER and future devices.

Tang added that even more powerful supercomputers in the "exascale" range and beyond will help meet the formidable future challenge of designing a demonstration fusion reactor (DEMO) to follow ITER. With ITER and leadership-class computing being two of the most prominent current missions of the U.S. Department of Energy, whole-device integrated modeling, which can achieve the highest possible physics fidelity, is a most worthy exascale-relevant project for producing a world-leading, realistic predictive capability for fusion. This should prove to be of major benefit to U.S. strategic considerations for energy, ecological sustainability, and global security.

William Tang is the Director of the Fusion Simulation Program at the Princeton Plasma Physics Laboratory (PPPL), the U.S. Department of Energy (DoE) national laboratory for fusion research. He is a Fellow of the American Physical Society, and on October 15, 2005, he received the Chinese Institute of Engineers-USA (CIE-USA) Distinguished Achievement Award. The CIE-USA, the oldest and most widely recognized Chinese-American professional society in North America, honored him "for his outstanding leadership in fusion research and contributions to fundamentals of plasma science." He has been a Principal Research Physicist at PPPL and Lecturer with Rank & Title of Professor in the Department of Astrophysical Sciences since 1979, served as Head of the PPPL Theory Department from 1992 through 2004, and was Chief Scientist at PPPL from 1997 until 2009. He also played a prominent national leadership role in the formulation and development of the DoE's multi-disciplinary program in advanced scientific computing applications, SciDAC (Scientific Discovery through Advanced Computing). For the next two years he will be the Principal Investigator (PI) leading a national multi-disciplinary, multi-institutional team of plasma scientists, computer scientists, and applied mathematicians from 6 national laboratories, 2 private companies, and 9 universities to carry out the program definition and planning of the DoE's Fusion Simulation Program (FSP).

In his research, Dr. Tang is internationally recognized for his leading role in developing the requisite mathematical formalism as well as the associated computational applications dealing with electromagnetic kinetic plasma behavior in complex geometries. He has over 200 publications, with more than 125 peer-reviewed papers in Science, Phys. Rev. Letters, Phys. Fluids/Plasmas, Nuclear Fusion, and other journals, and an h-index of 42 on the Web of Science, with over 5,300 total citations. He has guided the development and application of the most widely recognized codes for realistically simulating complex transport dynamics driven by microturbulence in plasmas and is currently the Principal Investigator of a multi-institutional DoE INCITE project on "High Resolution Global Simulations of Plasma Microturbulence." The INCITE (Innovative and Novel Computational Impact on Theory and Experiment) Program promotes cutting-edge research that can only be conducted with state-of-the-art supercomputers. Prof. Tang has also been a key contributor to teaching and research training in Princeton University's Department of Astrophysical Sciences for over 30 years and has supervised numerous successful Ph.D. students who have gone on to highly productive scientific careers, including recipients of the prestigious Presidential Early Career Award for Scientists and Engineers (PECASE) in 2000 and 2005.

A podcast and the presentation are available.

Abusing Reconciliation?

Datagraphics can be used to inject civility into public debates. Senate Republican leader Mitch McConnell (among others) has been quoted as saying that reconciliation has never been used to pass something like health care before. People predisposed to believe what the right says take him at his word without checking further; people predisposed to disbelieve what the right says ignore him. In the middle, the debate is stuck in limbo, with tempers rising on both sides and no new information forthcoming.

Now they are suggesting they might use a device which has never been used for this kind of major systemic reform. We know what it would be — the only thing bipartisan about it would be the opposition to it, because a number of Democrats have said, "Don't do this. This is not the way to go." — Senator McConnell on Fox News

Recently the Sunlight Foundation ran an infographic examining the past 20 years of Senate reconciliation bills. At a glance you can see which bills had bipartisan support and which didn't, and the list is relatively small (13 bills), so deeper inspection is easy. What's not immediately obvious, however, is any indication of what Senator McConnell alleges, namely the "magnitude" of these bills. Most seem on their face to be simple budgetary adjustments. The "Balanced Budget Act of 1997" had wide bipartisan support and a relatively simple title. The "Jobs and Growth Tax Relief Reconciliation Act of 2003," however, had hardly any bipartisan support; perhaps more tax cuts were added in reconciliation than were in the original bill? If the graph could be altered to show some sort of significance factor, then that would be an infographic!

This significance factor or magnitude could be quantified by comparing the changes against the original bill. There are well-established computerized ways to compare documents, and a significance factor could be calculated from the size of those changes: bills with bigger "change files" would show up with fatter lines in the graphic above. Of course this technique would have problems. It could give false positives if many words were needed to describe a relatively minor change, and false negatives if a massive portion of the original bill was simply removed (describing a removal is easy, essentially "delete lines 1-1000"). But even a flawed mechanism in the hands of knowledgeable people can be a useful tool: they can quickly weed out the false positives, reinstate the false negatives, and focus viewers' attention on the issues that matter.
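Here is a minimal sketch of that idea in Perl. This is entirely my own illustration: the file names are placeholders and the crude line-level scoring is an assumption, not anything the Sunlight Foundation actually computed.

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical sketch: score the "magnitude" of a bill by comparing the
# reconciled text against the original. File names are placeholders.
sub read_lines {
    my ($path) = @_;
    open(my $fh, '<', $path) or die "Cannot open $path: $!";
    chomp(my @lines = <$fh>);
    return \@lines;
}

# Crude line-level "change file" size: lines present in one version only.
sub change_score {
    my ($old, $new) = @_;
    my (%in_old, %in_new);
    $in_old{$_}++ for @$old;
    $in_new{$_}++ for @$new;
    my $changed = 0;
    $changed += grep { !$in_new{$_} } @$old;    # lines removed from the original
    $changed += grep { !$in_old{$_} } @$new;    # lines added in reconciliation
    return $changed / (@$old || 1);             # normalize by the original's length
}

my $score = change_score(read_lines('bill_original.txt'),
                         read_lines('bill_reconciled.txt'));
printf "significance factor: %.2f\n", $score;

A real implementation would want something smarter than exact line matching (a word-level diff, say), but even this would separate cosmetic amendments from wholesale rewrites.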

Such a tool would be most useful if we were looking at hundreds of bills, but we're only looking at 13. A responsible journalist would have prepared for an interview with Senator McConnell by digging into these 13 other bills, then asked him which of them would take second place behind health care among "significant bills passed through reconciliation." McConnell could then have used that as a springboard to describe how much greater a change health care is from that previous high-water mark, or the question could have revealed how hollow his talking point was. Instead we, as consumers of news, get neither.


The Last Digit of Pi

[This is a rough transcript of my TEDxNYED talk, delivered on March 6, 2010, in New York City at the Collegiate School. TEDxNYED was an all-day conference “examining the role of new media and technology in shaping the future of education.” For a meta-post about the experience of giving a TED(x) talk, please read “Academic Theater (Reflections on TED & TEDxNYED).” What I actually said and did at TEDxNYED deviated from this transcript; I engaged the audience directly a couple of times, once for fun and once to get their ideas about the subject. I’ll post the video when it’s available.]

I want to tell you a story about a forgotten realm of education and knowledge. It is a cautionary tale, a parable of what happens when the world changes, when tradition is challenged.

Until relatively recently in human history, pi was the much sought-after solution to what was long called the "rectification" or "quadrature" of the circle, fancy words more easily symbolized by the diagram in this slide. How can you transform that circle into the overlaid square of equal area? If the diameter of the circle is 1, the square's area would be one-quarter of pi, and each of its sides the square root of π/4.

Pi was a coveted number for thousands of years, imbued with magical properties. Generations of scholars pursued it doggedly, often considering it the be-all and end-all of geometry.

This is a different pi—pi as we moderns know it:

Well, not all of it, as I’m sure you know. It’s just the first 200 or so digits. The number stretches on forever. I hope you weren’t expecting me to reveal the actual last digit of pi. Because there isn’t one. Strange, no?

Pi wasn’t always this strange. The ancient Egyptians knew better, pegging the ratio of the circumference to the diameter of a circle at 4 over 3 to the 4th power. That’s considerably more definite, and thus much more sensible.

Archimedes knew better, homing in on the value of pi between a couple of very close fractions.

If you are a biblical literalist, pi would seem to be exactly 3, since the Bible clearly describes a line of 30 cubits as encompassing a circle of 10-cubit diameter.

And the solutions kept coming. From ancient mathematicians and philosophers, to medieval scholars, to the Renaissance and the Enlightenment. Everyone seemed capable of finding—with enough effort—the exact value for pi. Squaring the circle was an effort of genius in an ancient science perfectly described centuries ago by Euclid.

But something changed radically in the eighteenth century, just after that book on the right by Joubert de la Rue. A few mathematicians started to take more seriously the nagging feeling that pi didn’t have a perfect solution as a magical fraction. It might not have a last digit after all. This critical number at the center of mathematics might, in fact, be irrational. One mathematician began to reconceptualize pi.

And there he is: the dapper Swiss German mathematician Johann Heinrich Lambert:

He was the son of a tailor, obviously, and was mostly self-taught in mathematics. His brilliant work in the 1760s showed that π/4 could not be a rational number—you could never exactly figure out the value of one side of that square—and thus that pi too was irrational. After Lambert, math textbooks declared the matter solved.

That’s right, problem solved…

Except….circle-squaring kept on going. The world of mathematics had changed with the discoveries of the eighteenth century but somehow the message didn’t get through to many people. John Parker, on the left, came up with my personal favorite solution: pi is precisely 20612/6561. Some circle-squarers, like James Smith on the right, mocked Lambert’s proof as the work of a dilettante.

Things then got testy between the new mathematicians and those who clung to the prior vision of pi. The record of this warfare is as informative as it is humorous. In the 1860s and 70s, James Smith took on Augustus De Morgan, a math professor in London, in a series of short pamphlets, which were the Victorian equivalent of Twitter.

But unsurprisingly, the castigations of professors of mathematics didn't stop the circle-squarers. Their solutions kept on coming, even in the face of criticism, even after pi had been shown to be transcendental, meaning it isn't the root of any polynomial equation with rational coefficients. My favorite book from the turn of the twentieth century had this subtitle on the cover: "The great problem which has baffled the greatest philosophers and the brightest minds of ancient and modern times has now been solved by a humble American citizen of the city of Brooklyn."

Now, it’s easy to laugh at these misguided circle squarers, especially when they’re from Brooklyn. But if you read circle-squarers seriously, and stop to think about it, they are not so different from you or me. Even in our knowing times, we all persist in doing things that others have long since abandoned as absurd or passé.

History tells us that people are, alas, not very good at seeing the new, and instead are very good at maintaining the past at all costs. This is particularly true in education: Euclid’s Elements, written over 2,000 years ago, was still a standard math textbook well into the 19th century, despite major mathematical advances.

So it’s worth pausing to think about the last digit of pi. Why did so many continue to pursue pi as it was traditionally conceived, and why did they resist the new math?

Think for a moment about the distinction between the old and the new pi. The old was perfect, simple, ordered, divine; the new, seemingly imprecise, prosaic, chaotic, human. So the story of pi is the story, and the psychology, of what happens when the complex and new tries to overtake the simple and traditional.

It’s happening all around us in the digital age. We’re replacing what has been perceived as perfect and ordered with the seemingly imprecise and chaotic.

Look at what has happened, for instance, in the last decade with Wikipedia and the angst about the fate of the traditional Encyclopedia.

Or newspapers in the face of new forms of journalism, such as blogging. A former baseball statistician, Nate Silver of FiveThirtyEight.com, can brazenly decide to analyze elections and the economy better than most newspapers? Yes indeed.

Now this audience, hip to the right side of these screens, may want to be as mean as Augustus De Morgan to those still on the left. We may want to leave modern circle-squarers behind, and undoubtedly some of them will be left behind. But for the majority who are unsettled and are caught between the old and the new, we need other methods to convince them and to change the status quo. History tells us it’s not enough to say that people are blind to the future. We have to show precisely what the weaknesses of the old are…

…and we have to show how the new works better than the old.

Knowing pi correctly to the 10th digit is enormously helpful when accurately predicting the movements of heavenly bodies; try using James Smith’s 3 1/8 when tracing the arc of a planet or moon. For some physics, knowing pi accurately to the 40th digit is critical.
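As a back-of-the-envelope illustration (my own, not from the talk; the orbital radius is Earth's distance from the sun), here is what Smith's value costs you in Perl:

#!/usr/bin/perl
use strict;
use warnings;
use Math::Trig 'pi';   # Math::Trig ships with core Perl and exports pi()

# Trace a circular orbit of Earth's solar distance (~1.496e11 m) using
# James Smith's 3 1/8 instead of pi, and see how far off the path ends up.
my $radius = 1.496e11;                        # metres
my $error  = 2 * $radius * abs(pi - 3.125);   # difference in circumference
printf "error: %.2e metres, about %.0f Earth diameters\n",
       $error, $error / 1.2742e7;             # Earth's diameter ~1.2742e7 m

The error comes to roughly five billion metres, hundreds of Earth diameters, from a value that is wrong only in the third digit.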

Moreover, this modern pi may be strange, but its very strangeness opened up new avenues of research and thought that were just as intellectually challenging and rewarding as squaring the circle. The transcendental nature of pi led mathematicians to ponder infinite sequences of fractions and had an impact on chaos theory. In computer science, coming up with algorithms to reach a billion or trillion digits of pi as quickly as possible advanced the field. And, if you still want an unsolved problem to crack, see if you can figure out if pi is what is called a “normal number,” where the distribution of the digits 0-9 is uniform…

…or is there instead a preponderance of eights. Now that’s a tough problem, related to real issues in modern math. So there are still problems to be solved, more advanced problems. Math didn’t end with the end of the old pi—it just moved in new, more interesting directions.
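You can poke at the normality question empirically in a few lines of Perl (a sketch of my own, assuming a reasonably recent Math::BigFloat; a tally of the first thousand digits proves nothing about normality, of course):

#!/usr/bin/perl
use strict;
use warnings;
use Math::BigFloat;

# Tally the digits 0-9 among the first 1,000 decimal digits of pi.
my $pi = Math::BigFloat->bpi(1001)->bstr();   # "3.14159..." to 1001 digits
$pi =~ s/^3\.//;                              # keep only the decimals
my %count;
$count{$_}++ for split //, $pi;
printf "%d: %d\n", $_, $count{$_} // 0 for 0 .. 9;

The counts come out roughly even, but "roughly even for the digits we've checked" and "provably uniform forever" are separated by exactly the kind of hard mathematics the new pi opened up.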

But to get to that point, mathematicians had to show in a comprehensible way how the new pi created a new order.

Using the JSON Perl module

I just thought I'd make a quick blog post on how to use the JSON Perl module. Why use JSON when we have XML? I'll leave that debate to Russ or Richard; long story short, it offers easier object handling for the projected JavaScript-driven DVLF.
The Perl JSON module is actually very easy and pleasant to use: it will convert your Perl data structure into JSON without breaking a sweat.
Here's a quick example which I hope will be useful:

#!/usr/bin/perl
use strict;
use warnings;
use JSON;

my @list_of_files = @ARGV;    # e.g. file names passed on the command line

my %hash;
foreach my $file (@list_of_files) {
        open(my $fh, '<', $file) or die "Cannot open $file: $!";
        my @lines = <$fh>;
        close($fh);
        $hash{$file} = \@lines;    # store an array reference in the hash
}

my $json = JSON->new->pretty;      # enable indented, human-readable output
my $js = $json->encode(\%hash);    # convert the Perl data structure to JSON text
print "$js\n";

And done!
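For completeness, the JSON module also exports decode_json for the reverse direction. A quick sketch (the JSON string here is made up):

use JSON qw(decode_json);

# Parse JSON text back into a Perl data structure.
my $data = decode_json('{"notes.txt":["line one","line two"]}');
print scalar @{ $data->{'notes.txt'} }, " lines stored\n";    # prints "2 lines stored"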

Which manuscripts should we digitise?

(Detail from the Golden Canon Tables, Add. MS 5111)

The obvious answer to this question is: all of them! We all want access to free digital resources, but creating them is constrained by a series of practical considerations. How can we best deliver digitised manuscripts to your desktops? One answer is to secure funding for independent digitisation projects with achievable goals. Such a series of projects has to be placed squarely within a vision and strategy, and at the start of each one we have to ask ourselves: which manuscripts should we digitise next? For the first phase of the Greek Manuscripts Digitisation Project, we chose 250 manuscripts that offered a good range of all the different types and included some notable highlights of the collection. Before the project, these manuscripts were among the least accessible, since they had not been catalogued to modern standards. We are very grateful to the Stavros Niarchos Foundation for funding the first phase of the project, supporting our vision, and making our work possible.

It is, however, crucial that we also engage you. Here's how: contact me to answer the following question: which particular Greek manuscripts held by the British Library would you like to see digitised, and why? I cannot promise that your favourite manuscript will be in the next phase, but I can assure you that your feedback will inform our decision.

Juan Garcés