From owner-sc22wg5+sc22wg5-dom8=www.open-std.org@open-std.org  Fri Jul  8 05:37:25 2016
Return-Path: <owner-sc22wg5+sc22wg5-dom8=www.open-std.org@open-std.org>
X-Original-To: sc22wg5-dom8
Delivered-To: sc22wg5-dom8@www.open-std.org
Received: by www.open-std.org (Postfix, from userid 521)
	id C835F358700; Fri,  8 Jul 2016 05:37:25 +0200 (CEST)
Delivered-To: sc22wg5@open-std.org
Received: from mail.jpl.nasa.gov (smtp.jpl.nasa.gov [128.149.139.109])
	(using TLSv1 with cipher DHE-RSA-AES256-SHA (256/256 bits))
	(No client certificate requested)
	by www.open-std.org (Postfix) with ESMTP id 42896356ECB
	for <sc22wg5@open-std.org>; Fri,  8 Jul 2016 05:37:19 +0200 (CEST)
Received: from [137.79.7.57] (math.jpl.nasa.gov [137.79.7.57])
	by smtp.jpl.nasa.gov (Sentrion-MTA-4.3.1/Sentrion-MTA-4.3.1) with ESMTP id u683bDF2010465
	(using TLSv1.2 with cipher ECDHE-RSA-AES128-GCM-SHA256 (128 bits) verified NO)
	for <sc22wg5@open-std.org>; Thu, 7 Jul 2016 20:37:16 -0700
Subject: Re: (j3.2006) (SC22WG5.5759) [ukfortran] RE:  RE:   Units of
 measure
From: Van Snyder <Van.Snyder@jpl.nasa.gov>
Reply-To: Van.Snyder@jpl.nasa.gov
To: WG5 <sc22wg5@open-std.org>
In-Reply-To: <20160708013739.CF48735723B@www.open-std.org>
References: <20160619135920.D0F3F358287@www.open-std.org>
	 <20160629112043.BF09F3587AF@www.open-std.org>
	 <20160629123517.185A635828D@www.open-std.org>
	 <20160629190123.72A8035859B@www.open-std.org>
	 <20160702105054.18596358745@www.open-std.org>
	 <20160702202059.B9618358745@www.open-std.org>
	 <20160703085848.B63663584A2@www.open-std.org>
	 <20160705153207.E49A9358343@www.open-std.org>
	 <20160705173722.4977735852E@www.open-std.org>
	 <20160706152553.105A89DB160@www.open-std.org>
	 <20160706164712.D056F9DB160@www.open-std.org>
	 <1467851790.3820.152.camel@vanlap.vsnyder>
	 <20160707184955.C1409358287@www.open-std.org>
	 <20160708013739.CF48735723B@www.open-std.org>
Content-Type: text/plain; charset="ISO-8859-1"
Organization: Yes
Date: Thu, 07 Jul 2016 20:37:13 -0700
Message-ID: <1467949033.25709.149.camel@math.jpl.nasa.gov>
Mime-Version: 1.0
X-Mailer: Evolution 2.32.3 (2.32.3-36.el6) 
Content-Transfer-Encoding: 7bit
X-Source-Sender: Van.Snyder@jpl.nasa.gov
X-AUTH: Authorized
Sender: owner-sc22wg5@open-std.org
Precedence: bulk

On Fri, 2016-07-08 at 10:37 +0900, Cohen Malcolm wrote:

> Doing it via software tools has the significant advantages of
> (a) no impact on vendor implementations,
> (b) works even for compilers whose vendors have not provided it,
> (c) only takes a year or two to obtain, instead of more than 10 years via 
> the standard (or even more than that, seeing as how here we are in 2016 but 
> few compilers yet implement all of the standard we started working on in 
> 1998 and published in 2004!).

Why were coarrays not done in this way?  How about C interop?  PDTs?
What was the point of DO CONCURRENT, especially given that OpenMP
directives do the same thing?  Do OpenMP directives have any impact on
vendor implementations, or are they handled by a preprocessor?

Doing anything via software tools has the significant disadvantage that
the lifetime of support of the tools bears no relationship to the
lifetime of support that a language feature would have.  When Fujitsu
said "nothing newer than Fortran 95 for Intel processors, no 64-bit
product, and no Linux product maintenance," and Tom Lahey retired, we
switched to other compiler vendors without any trauma.  That's not an
option if the software tool company goes out of business, and there
aren't any others who offer the same functionality.  Maybe if it were
done by directives that the processor handled....

The things published in 2004 but not yet universally available were much
more difficult than the units proposal.  Indeed, coarrays were much more
difficult, but they're widely available.  Nice try at a straw man,
though.

We did ask for this in 1986.  It took me eleven more years to get
funding to join J3.  The powers-that-be at JPL didn't believe us in 1986
that automatic units checking might avert a catastrophe.

> There are software companies which provide software tools for Fortran.  NAG 
> is such a company.  As I mentioned before, I am sure that NAG would be 
> prepared to design the directives and implement tools for units handling, 
> and almost certainly for less than 1% of $300M.  So far, neither I nor 
> anyone else at NAG has received even a hint of interest in such a thing.

A company that builds software tools (not NAG) contacted us.  We were
expected to bear the entire development expense.  Then they would sell
the tool (and sell maintenance and support to us) and we would not share
in the profits.  We weren't sure whether Congress would stand for it.
It looked too much like a "Solyndra" scam.  Maybe it would sneak
through.  Our Ethics office said "don't touch it."  We didn't want to
take the chance.  Somebody might have gotten fired, or even prosecuted.

We used a software tool for a different purpose, from a vendor who had
developed it at their own expense.  A few years after we became
essentially totally dependent upon it, we found a few bugs and some
deficiencies.  The company was out of business.  Working around the bugs
wasn't free.  Developing a simpler but nontrivial tool to overcome the
deficiencies by post-pre-processing its output wasn't free.  We were
happy to escape from it, but that wasn't free either.  It wasn't obvious
it had reduced our long-term labor cost.

> <<<
> If your organization had lost $300 million due to a trivial software mistake 
> that the programming language and its runtime could have caught or corrected 
> automatically, would you roll over and play dead?
> >>>
> 
> No, I would build, or contract to be built, a tool for checking code that is 
> annotated with units information, for the reasons given above.

Those who read the proposal might have noticed that it doesn't just do
checking.  Definition of a conversion unit creates a trivial generic
function and its inverse.  Call that "code generation" if you like, but
I expect it would be done long before the real "code generator" comes
into play.  Nothing like the code generation necessary for a coindexed
reference, or DO CONCURRENT.  That means the tool would be a
preprocessor, not an analysis tool.  Conversion is not automatic; it's
explicit.  It looks just like a function reference, and that's what the
proposal calls it.  I don't think of that as a new code generation
problem.  A preprocessor would probably need to inline the conversion
functions, because they couldn't be internal functions in an internal
procedure if a conversion unit is defined in an internal procedure.  I'm
not sure how a preprocessor would or could handle units checking and
conversion during formatted input.
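For those who haven't read the proposal, a sketch of what units-annotated
source and an explicit conversion might look like (the spelling here is
hypothetical and for illustration only; the proposal's actual syntax may
differ):

```
! Illustrative units-annotated Fortran, as a preprocessor might see it.
! The UNIT syntax below is a hypothetical rendering, not the proposal's
! exact spelling.
module kinematics
  unit :: meter, second
  unit :: foot = 0.3048 * meter   ! conversion unit: implies trivial
                                  ! generic functions foot() and meter()
contains
  function speed(distance, time)
    real, unit(meter)        :: distance
    real, unit(second)       :: time
    real, unit(meter/second) :: speed
    speed = distance / time   ! checked: meter/second on both sides
  end function speed
end module kinematics

! Conversion is explicit and looks like a function reference:
!   d_m = meter(d_ft)   ! generated from the foot = 0.3048*meter definition
```

The point of the sketch is that the "code generation" involved is only the
trivial multiply-by-a-constant conversion functions; the checking itself is
static.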

While we waited during the interregnum between Fortran 77 and the
availability of Fortran 90 processors, we used a preprocessor to provide
a more comprehensive set of control structures, and a limited form of
internal procedures (we actually started it before Fortran 77).  There
was a love-hate relationship with it for decades.  Debugging was tedious
because line numbers were different in the source and generated code.
Error messages from the compiler were difficult to attach to the right
place.  Source-level debuggers were pointless, because they only worked
on the (ugly) generated code.  It stuck around for more than ten years
after Fortran 90 compilers became available, long after the primary
developer had retired.  Six million lines of navigation software used
it.  It might have been part of the reason that powers-that-be decided
to re-code that software in C++ instead of converting it to real Fortran
syntax.  It's not obvious it reduced our overall long-term labor cost.
But the code looked nicer than Fortran 77.

Concerning preprocessors, we got a very strong "never again" signal.
Analysis tools are useful, and we use some.

> Cheers,


