Are we taking supercomputing code seriously?


hanelyp
Posts: 2261
Joined: Fri Oct 26, 2007 8:50 pm

Are we taking supercomputing code seriously?

Post by hanelyp »

Are we taking supercomputing code seriously?
Someone remarked to me recently that the problem with scientific software is that most of it is written by amateurs. Harsh perhaps, but it got me thinking. The point behind the remark is that most of the software used for simulation in scientific research, especially on supercomputers, is written by scientists rather than by professional numerical software engineers.
The article goes on to discuss shortcomings common in computer models used in science.

kcdodd
Posts: 722
Joined: Tue Jun 03, 2008 3:36 am
Location: Austin, TX

Post by kcdodd »

Physicists invented computing. It's a bit ironic to say we're all amateurs now.
Carter

Aero
Posts: 1200
Joined: Mon Jun 30, 2008 4:36 am
Location: 92111

Post by Aero »

I learned FORTRAN in 1968. We had the same mindset about programmers back then: "IF they are going to program it right, they have to become engineers." Well, guess what, they did. By the same token, "IF they are going to program physics research right, they will have to become Research Physicists." I expect they probably will; I certainly wouldn't put it past them. But then, neither would I say that the programming skills of the physicists I've worked with are amateur level.
Aero

MSimon
Posts: 14335
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois
Contact:

Post by MSimon »

kcdodd wrote: Physicists invented computing. It's a bit ironic to say we're all amateurs now.
I doubt most code produced by physicists could pass an FAA or FDA review.
Engineering is the art of making what you want from what you can get at a profit.

kcdodd
Posts: 722
Joined: Tue Jun 03, 2008 3:36 am
Location: Austin, TX

Post by kcdodd »

I am not sure what that even means. Why would the FDA review code?
Carter

Aero
Posts: 1200
Joined: Mon Jun 30, 2008 4:36 am
Location: 92111

Post by Aero »

MSimon wrote:
kcdodd wrote: Physicists invented computing. It's a bit ironic to say we're all amateurs now.
I doubt most code produced by physicists could pass an FAA or FDA review.
I doubt most code produced by physicists entails the risk of life or death from errors, so I don't think the rigor of an FAA or FDA code certification is appropriate.

The FDA would review code to ensure that the statistical results of clinical trials are correctly calculated. The SAS software package is (or was) the only package approved for such calculations. The FAA reviews code because of similar life-or-death risks from errors (air traffic control software, for example).

http://en.wikipedia.org/wiki/SAS_%28software%29
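For illustration only, and not anything the FDA actually prescribes: a minimal Python sketch of the kind of cross-check such a review looks for, a hand-rolled statistic compared against an independently validated reference implementation (scipy here). The trial data, the statistic, and the tolerance are all invented for the example.

Code:
import math
import statistics
from scipy import stats

treatment = [5.1, 4.9, 6.2, 5.8, 5.5, 6.0]   # made-up trial arm A
control   = [4.2, 4.8, 5.0, 4.5, 4.9, 4.4]   # made-up trial arm B

def welch_t(a, b):
    """Welch's t statistic, coded independently of the library."""
    va, vb = statistics.variance(a), statistics.variance(b)   # sample variances
    return (statistics.mean(a) - statistics.mean(b)) / math.sqrt(va / len(a) + vb / len(b))

t_ours = welch_t(treatment, control)
t_ref, p_ref = stats.ttest_ind(treatment, control, equal_var=False)

# The review-style check: two independent implementations must agree.
assert math.isclose(t_ours, t_ref, rel_tol=1e-9), (t_ours, t_ref)
print(f"t = {t_ours:.4f}, p = {p_ref:.4f} (hand-rolled and reference agree)")

The point is not the particular statistic; it is that no number gets trusted on the strength of a single implementation.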
Aero

kcdodd
Posts: 722
Joined: Tue Jun 03, 2008 3:36 am
Location: Austin, TX

Post by kcdodd »

It also assumes code produced by a software engineer could fare any better. I have rarely used any library, built by "software professionals", that was free of problems of usability, extensibility, and reliability. What you are really talking about is adding time and money to research. Some research warrants it, some does not. Some equipment and code is reusable and some is not. The level of abstraction he suggests may or may not increase flexibility, assuming such flexibility would even be useful, but it may also decrease efficiency, one thing which is almost always useful. Reliability is always an issue on any project of reasonable size. Again, more time and money to get that. Test, test, test. Experimentalists are used to testing stuff, so what's the diff?
Carter

MSimon
Posts: 14335
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois
Contact:

Post by MSimon »

I don't use libraries. Unless forced to. I roll my own and then test random cases, corner cases and transition cases.
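A minimal sketch of what that recipe can look like in practice, in Python purely for illustration; the clamp() routine, the ranges, and the tolerances are invented for the example and are not MSimon's actual code. Random cases are checked against an independent oracle, corner cases hit the extremes, and transition cases sit just either side of each boundary.

Code:
import random

def clamp(x, lo, hi):
    """Function under test: limit x to the range [lo, hi]."""
    return lo if x < lo else hi if x > hi else x

def oracle(x, lo, hi):
    """Independent reference implementation used as the test oracle."""
    return min(max(x, lo), hi)

LO, HI = -10.0, 10.0

# Random cases: broad sweep checked against the oracle.
random.seed(42)
for _ in range(10_000):
    x = random.uniform(-1000.0, 1000.0)
    assert clamp(x, LO, HI) == oracle(x, LO, HI), x

# Corner cases: extremes and a degenerate (zero-width) range.
for x in (float("-inf"), float("inf"), -1e308, 1e308, 0.0, LO, HI):
    assert clamp(x, LO, HI) == oracle(x, LO, HI), x
assert clamp(5.0, 3.0, 3.0) == 3.0

# Transition cases: values just either side of each boundary.
eps = 1e-9
for x in (LO - eps, LO, LO + eps, HI - eps, HI, HI + eps):
    assert clamp(x, LO, HI) == oracle(x, LO, HI), x

print("random, corner and transition cases all pass")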
Engineering is the art of making what you want from what you can get at a profit.

MSimon
Posts: 14335
Joined: Mon Jul 16, 2007 7:37 pm
Location: Rockford, Illinois
Contact:

Post by MSimon »

kcdodd wrote: It also assumes code produced by a software engineer could fare any better. I have rarely used any library, built by "software professionals", that was free of problems of usability, extensibility, and reliability. What you are really talking about is adding time and money to research. Some research warrants it, some does not. Some equipment and code is reusable and some is not. The level of abstraction he suggests may or may not increase flexibility, assuming such flexibility would even be useful, but it may also decrease efficiency, one thing which is almost always useful. Reliability is always an issue on any project of reasonable size. Again, more time and money to get that. Test, test, test. Experimentalists are used to testing stuff, so what's the diff?
Well, OK: test code vs. production code.

So how does one explain the "Harry Read Me" file from Climategate?

Once you get sloppy it hardly ever improves. And the test code winds up encapsulated in the production suite. Quality code takes time. Why waste time when there are new discoveries to be made?

Now every bit of code I write may not go through the FAA grinder. But I like to write it all as if that was the intent.

Standards.
Engineering is the art of making what you want from what you can get at a profit.

TallDave
Posts: 3141
Joined: Wed Jul 25, 2007 7:12 pm
Contact:

Post by TallDave »

Aero wrote: I doubt most code produced by physicists entails the risk of life or death from errors,
Au contraire. The GISS and GCM code is the basis for decisions that could save or cost millions of lives.
n*kBolt*Te = B**2/(2*mu0) and B^.25 loss scaling? Or not so much? Hopefully we'll know soon...
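As an aside on the beta = 1 condition in the signature above, n*kB*Te = B^2/(2*mu0), here is a quick worked number; the density and temperature are arbitrary round figures chosen only to show the arithmetic, not anyone's design point.

Code:
import math

kB  = 1.380649e-23        # Boltzmann constant, J/K
mu0 = 4e-7 * math.pi      # vacuum permeability, T*m/A

n  = 1e20                 # electron density, m^-3 (assumed)
Te = 1e8                  # electron temperature, K (assumed, roughly 8.6 keV)

# beta = 1: plasma pressure n*kB*Te equals magnetic pressure B^2/(2*mu0)
B = math.sqrt(2 * mu0 * n * kB * Te)
print(f"B = {B:.2f} T balances n = {n:.0e} m^-3 at Te = {Te:.0e} K")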

BenTC
Posts: 410
Joined: Tue Jun 09, 2009 4:54 am

Post by BenTC »

Just as scientific results are put up for peer review, ideally any code used to process experimental data should be equally subject to peer review. Unfortunately that may expose more intellectual property than a paper would.

One of the big issues not mentioned in the article is understanding the domain of the problem. Many software systems are reportedly never put into use because of a poor match between the actual requirements and what the developers understood. That is not just the fault of developers; often the user does not initially understand their own requirements. The terms from Software Quality Assurance are Verification and Validation: it's not just writing the software right, it's also writing the right software.

The scientist is in a much better position than a software developer to understand the domain of the problem they are trying to solve. I imagine that for scientific coding, the code would evolve as understanding grew; it is essentially a prototyping environment. There may be a benefit in having an intimate knowledge of the code rather than using a black box written by someone else. Also, the lifecycle delays introduced by someone else doing the coding might not be acceptable. It's MUCH easier to write software just for yourself than to make it robust enough for other people to use.

In my previous life in IT, one of the bane-of-existence scenarios was someone who had done a one-week course in Microsoft Access and then quietly written a database application that was so useful everyone in their department started using it, and it quietly became mission critical. Then suddenly it started hitting data-integrity issues and hard limits in the software, and it became a hot potato we had to clean up. On the other hand, I would say the hardest part of software development is properly eliciting the user's requirements. If the user's effort could be considered a prototype, after which professional developers reimplement more robustly, that may be the better path; had the end user been required to formally define all their requirements up front, the application would never have been written. It was Fred Brooks in The Mythical Man-Month who said, "Plan to throw one away; you will, anyhow."
In theory there is no difference between theory and practice, but in practice there is.

WizWom
Posts: 371
Joined: Fri May 07, 2010 1:00 pm
Location: St Joseph, MO
Contact:

Post by WizWom »

Well, I'll be working with MCNP-X pretty soon, and I'll have an answer as to the elegance of it.
Wandering Kernel of Happiness

AcesHigh
Posts: 655
Joined: Wed Mar 25, 2009 3:59 am

Post by AcesHigh »

Aero wrote:
MSimon wrote:
kcdodd wrote: Physicists invented computing. It's a bit ironic to say we're all amateurs now.
I doubt most code produced by physicists could pass an FAA or FDA review.
I doubt most code produced by physicists entails the risk of life or death from errors, so I don't think the rigor of an FAA or FDA code certification is appropriate.
What about the risk of death of the entire universe due to a black hole being created by the LHC? ;)

Aero
Posts: 1200
Joined: Mon Jun 30, 2008 4:36 am
Location: 92111

Post by Aero »

AcesHigh wrote:
Aero wrote:
MSimon wrote: I doubt most code produced by physicists could pass an FAA or FDA review.
I doubt most code produced by physicists entails the risk of life or death from errors, so I don't think the rigor of an FAA or FDA code certification is appropriate.
What about the risk of death of the entire universe due to a black hole being created by the LHC? ;)
That is the exception that proves the rule.
Aero

DeltaV
Posts: 2245
Joined: Mon Oct 12, 2009 5:05 am

Post by DeltaV »

WizWom wrote:Well, I'll be working with MCNP-X pretty soon, and I'll have an answer as to the elegance of it.
https://mcnpx.lanl.gov/

See what you can do about shielding against those pesky gammas. Lighter is better.
