General Programming Practices

I need more terminology to talk to others about a Fortran program. What other phrases are there besides "Tougher than all get-out" and "A fine construct"?

Since we're at a University, it's essential that you work the word "paradigm" in there somewhere. How about "optimal paradigm for implementation of the algorithm".

Do we need to prompt the user for input on our programs?

Always! In this class, any programmed "read" from the terminal must be preceded by writing some intelligible message asking for input.
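A minimal sketch of the pattern (the program and variable names are just illustrative):

```fortran
c     Always write a prompt before reading from the terminal.
      PROGRAM ASKUSR
      REAL TEMP
      WRITE(*,*) 'Enter the temperature in degrees C:'
      READ(*,*) TEMP
      WRITE(*,*) 'You entered ', TEMP, ' degrees C'
      END
```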

Why do you put so many lines of empty space in your programs?

I hope the lines aren't totally empty. They should contain a "c" in column one. These "blank" lines are just to make the comments stand out from Fortran code lines or to highlight key blocks of a program.
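For example (the calculation itself is just filler):

```fortran
      PROGRAM STYLE
      REAL X, Y, AVG
c
c     The "blank" lines above and below this comment carry a c in
c     column one, setting the comment block off from the code.
c
      X = 1.0
      Y = 3.0
      AVG = 0.5*(X + Y)
      WRITE(*,*) 'Average = ', AVG
      END
```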

What can I do if my lines wrap around to the next line?

You have to get a feel for the location of the 72nd character position on the screen, or do a lot of counting. Once you hit column 72, you must hit the RETURN key, put some character (I like & or #) in column 6 of the next line, then pick up typing where you left off. I usually stop before column 72 at some natural break point between variables:

      IF ( (X.LT.2.0.AND.Y.GT.30.0).OR.(X.GT.1000.0.AND.Y.LT.-40.))
     &     THEN

Since Fortran tends to ignore blanks, the extra ones in the above 2 lines don't cause problems.

Do you have to count spaces or is there a command to show what space you are in?

I don't know of any way to do this in vi, other than count spaces. You can include comment lines in the program that act as space counters:

c        1         2         3         4         5         6         7
c23456789012345678901234567890123456789012345678901234567890123456789012

Is it possible to print our Fortran code for reference during debug?

You can bring a copy of the file to your own PC with FTP and print it there. Sometimes the "Print Screen" button works on your PC, but Windows usually messes that one up. You can also copy the listing from your Telnet window using the Edit menu, and Paste it into a word processor window for printing. If you are in 316 Hammond, and want a copy of the file "hw4.f", type:

lpr -Pascii hw4.f

How do we know where various steps go in a Fortran program?

Some commands have special locations, but most are located by the needs of the specific program. The PROGRAM card is always first. Statements giving variable types (INTEGER, REAL, LOGICAL, CHARACTER, ...) should precede "executable" statements. The END card must always be at the end of the program unit.
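A minimal skeleton showing that ordering (the names are invented):

```fortran
      PROGRAM ORDER
c     Type declarations come before any executable statement.
      INTEGER N
      REAL X
c     Executable statements follow.
      N = 4
      X = 2.0*N
      WRITE(*,*) 'X = ', X
c     The END card closes the program unit.
      END
```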

When accessing a data file in a program, can I change directories?

Yes. If you have a subdirectory called "test" under the directory containing your program, you can open a file in "test" for reading on unit 11 by including the subdirectory in the file name given to OPEN. For example, OPEN(11,FILE='test/myfile.dat') opens the file "myfile.dat" in "test".
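A fuller sketch, assuming a file named "myfile.dat" exists in "test" and holds at least one number (the names are only examples):

```fortran
      PROGRAM RDFILE
      REAL VAL
c     Open the file in subdirectory "test" on unit 11, read the
c     first value, and release the unit.
      OPEN(11,FILE='test/myfile.dat',STATUS='OLD')
      READ(11,*) VAL
      CLOSE(11)
      WRITE(*,*) 'First value: ', VAL
      END
```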

How does dbx work?

I've never looked at the details of dbx. Debuggers have a couple ways of doing what they do. In the simplest implementation dbx is acting as a conduit for the instructions in your program. In this mode dbx is pulling instructions or blocks of instructions out of your executable file "a.out", and passing them to the CPU (of course the CPU is running dbx too, but sorting such things out is the operating system's problem). When you ask for a breakpoint at line 100, dbx looks in the symbol table to find the precise instruction in "a.out" that corresponds to the beginning of that line. When you type "run" or "cont", it then passes all program instructions up to that one at the beginning of line 100 on to the CPU, before prompting you for more dbx commands. If you ask for the value of "x" using "print x", dbx looks through the symbol table for the memory address of x and the type of x (real or integer), grabs the value from memory, and prints it in the appropriate format.

Can you explain exactly what dbx will do for me?

Exactly? Probably not. As I've said in class, it will let you follow the flow of a sample program and watch values change on all variables so that you will understand how the Fortran (or C) statements are operating. You may not fully appreciate the more important aspect of dbx until you start writing much longer programs than we attempt in this class. An important aspect of any programming effort is generating an independent check on the program's results. What do you do when the independent check and the program don't match? With dbx, you can quickly examine the intermediate results within your program, comparing them to intermediate results in your independent check, or simply looking for wacky numbers. This generally isolates the cause of the bad result faster than staring at several hundred lines of code until you see something wrong.

How do you get rid of the dbx prompt when you're finished using it? Also, how do I get it to run properly? I tried it on hw3.f and it gave me some trouble.

To get rid of the dbx prompt type "quit". Most programs let you out with either "quit", "end", or "exit". When all else fails, try holding down the Control key and hitting C or Z. Control Z just suspends your job, and you may get a message about "Stopped Jobs" when you try to exit from Unix. In this case Unix lets you out, and destroys the stopped job, if you type "exit" a second time.

To get dbx running properly, try keeping a printed copy of the dbx lecture next to you to remind you of key instructions. It would probably also be useful if you run through a few sample dbx sessions with a TA or with me.
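For reference, a typical short session might look like this (the line number and variable name are only examples):

dbx a.out
(dbx) stop at 100
(dbx) run
(dbx) print x
(dbx) cont
(dbx) quit

Here "stop at 100" sets a breakpoint at line 100, "run" executes up to it, "print x" shows the current value of x, and "cont" resumes execution until the next breakpoint or the end of the program.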

Why are different languages used in different fields? Why do EE's use C and other engineers Fortran? How come no one seems to use Pascal or Quick Basic?

As I've said before, different languages permit execution of different classes of programs with varying degrees of difficulty. If I need to do a lot of manipulation of character strings, or need very good access to all of the Unix system routines, I use C. If I need to do serious number crunching on the fastest machines for simulations of things like fluid flows or the dynamics of mechanical systems, I use Fortran to take advantage of its array constructs and intrinsic functions. At one time I would have chosen Fortran simply because it was more standardized, but standardization of C is good now too. I don't know of international standards for Pascal or Basic, but they are a different story.

Another major driving force behind the use of languages is history. A language can come to dominate a field simply because most applications in that field have historically been written in that language, and members of that specialty get used to the particular language from modifying these older applications. At some point this accumulation of history can work in the opposite direction. I may remain obscure as a modifier of old codes, but make a major reputation for myself as the first to implement a significant application in a trendy new computer language (regardless of the actual merits of the language). A similar dynamic is at work in the proliferation of computer languages. As a computer scientist I could become a lot more famous inventing a new language than finding better ways to modify an old one (yes, I'm getting a little cynical in my old age).

However, history and the accumulation of applications tend to dominate. The cost in dollars of converting old applications to a new language is so large that groups working in a given area train their new workers and students to use the language of history, rather than let them redo old work in whatever language they first learned. If the language of choice is found to be less than adequate, it is easier to extend the language with new standards (e.g. Fortran 90) than to throw out everything and start from scratch with a totally different language. Pascal, Basic, and a host of other languages never really dropped into the historical development of major fields of research at the right times, nor did they generally arrive with significantly advantageous features. However, when the new area of writing software for embedded systems (things like missile guidance systems, fire control, etc.) took off in the military, an opening existed for a new language, and Pascal was reincarnated as Ada.

If you ever want a taste of the costs of converting to different languages, take a look at decisions to acquire new computers within major organizations. It is a general rule that when a change in computers is accompanied by a change in operating system, the cost of converting existing applications to interface with the new operating system exceeds the cost of the new hardware by a substantial factor. Adapting a program to use a new operating system involves only a partial recoding, not a total reprogramming effort.

I've already learned Pascal and loved it, but my teacher told me Pascal is kind of looked down upon and not used. Why?

I'm not the guy to answer that directly, because my Pascal experience has been limited to translating a fairly small amount of code from Pascal to Fortran. However, I will pass on a comment by a heavy-duty programmer whom I respect. He said Pascal is a language designed to prevent inexperienced programmers from doing things that can get them into trouble. I guess the idea is that it is overly constraining for highly experienced and creative programmers. (This is not always a bad thing.) There is probably one other reason that Pascal has fallen from favor. The language called Ada (named after Ada Lovelace, and widely used by the Department of Defense) is a more powerful follow-on to Pascal, and is popular with many former Pascal programmers.

What is the major difference between Fortran and C?

Fortran was written for the express purpose of performing scientific and engineering calculations. C was originally written as a language for building computer operating systems and utilities. As a result, C is more versatile at handling complex data structures (the gap is not as large with Fortran 90 as with older Fortran versions), and has more powerful intrinsic functions for processing input/output and character strings.

Why not use C instead of Fortran?

For new programs, generally, you can use either. The best reasons to use Fortran on a new program are for scientific applications with data represented by multidimensional arrays (later in the semester), programs that take advantage of special features of Fortran Intrinsic functions, or when you are doing calculations with complex numbers. Fortran has a clear advantage in these areas. Why not give up on Fortran totally and always use C? Two reasons. The quantity of very useful scientific programming applications written in Fortran is huge. They aren't going away soon, and the cost of converting them to C is astronomical. The second reason is that there are a lot of us old but still vaguely productive programmers who will continue to tinker in the language.
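For instance, complex arithmetic is built directly into the language; a quick sketch:

```fortran
c     Fortran's COMPLEX type handles the real/imaginary algebra
c     for you.  The product of (1,2) and (3,-1) is (5,5).
      PROGRAM CMULT
      COMPLEX A, B, C
      A = (1.0, 2.0)
      B = (3.0, -1.0)
      C = A*B
      WRITE(*,*) 'A*B = ', C
      END
```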

How do you know the difference between Fortran 77 and 90?

Anything in Fortran 77 is contained in Fortran 90. You are always writing Fortran 90. I call out features that are specific to 90 (not in 77), because you may be stuck working somewhere with an older compiler.

Can Fortran do Graphics?

No, and Yes. Specific graphics capabilities are not part of the Fortran standard, but then neither are they part of the ANSI C standard. Graphics eventually require hardware-specific actions to put the right color and/or intensity at a given point on a screen. Functions and subroutines that can be called from Fortran to do such jobs are available on the vast majority of systems, with varying levels of convenience. For workstations, you should look for ways to call XWindows routines for graphics. For PCs, check the manual of your specific Fortran compiler.

You mentioned restart dumps. What are they?

At times I'm in the business of simulating the behavior of Nuclear Power Plants during various accidents. To do this I have to solve time dependent partial differential equations. I'm doing something a lot like the Implicit Euler solution, but it is applied to a coupled system of 6 partial differential equations typically evaluated at 2000-4000 points in the reactor. Think of it as solving a system of 20,000 coupled equations for 20,000 unknowns, at each step in time. As you can imagine, this takes quite a bit of computer time. I might run the program a full day on a computer to see the results of 1000 seconds of the actual accident. So what happens if, near the end of the day's computing, the computer develops problems and my program is killed? Not much is lost, if I've been making restart dumps.

I build into my program a subroutine with the job of watching the elapsed time since the job started or since the last restart dump occurred. How long I wait depends on how much of the calculation I'm willing to redo. Typically I will give it an hour. When the subroutine sees that this time has passed, it calls another set of subroutines that dump the entire state of my system to the end of a disk file. This includes all of the pressures, temperatures, and other necessary fluid and metal properties stored in my blank common (it's an old code), and many other variables stored in named commons giving the status of valves, numbers of spatial points, and other such information. Unless the hard disk is destroyed, I can recover the state of my calculation later, and continue without re-evaluating too many solution time steps.

In addition to the above wall clock test for restart dumps, I also include tests on "simulation time" (sum of all my time steps in the Euler integration), and some special dumps based on events that I know could cause stability or convergence (of the Newton iteration) problems that would degrade my solution. When I review the results of a calculation, I may see that the numerical solution has silly behavior (usually oscillations associated with numerical instability) beyond some time. I just restart from the last dump before the troubles and calculate that interval with a smaller time step.
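A sketch of a dump test driven by simulation time (the subroutine and variable names are invented for illustration, and a real code dumps far more state; ACCESS='APPEND' is a common extension, spelled POSITION='APPEND' in standard Fortran 90):

```fortran
c     Append the full state to an unformatted dump file whenever
c     TINTVL seconds of simulation time have passed since the
c     last dump.
      SUBROUTINE CHKDMP(TSIM, TLAST, TINTVL, P, T, N)
      INTEGER N, I
      REAL TSIM, TLAST, TINTVL, P(N), T(N)
      IF (TSIM - TLAST .GE. TINTVL) THEN
         OPEN(21, FILE='restart.dmp', FORM='UNFORMATTED',
     &        ACCESS='APPEND')
         WRITE(21) TSIM, N, (P(I), I=1,N), (T(I), I=1,N)
         CLOSE(21)
         TLAST = TSIM
      END IF
      END
```

On restart, the program reads records back until it finds the last complete dump, then resumes the time integration from that state.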


Maintained by John Mahaffy :