Is Fortran the Best Programming Language? Asking ChatGPT

By James Reinders, for your amusement

March 23, 2023

Guest contributor James Reinders concludes his “interview” with ChatGPT, focusing on Fortran (greatest programming language ever?) and exploring what the next steps for chatbots may be.

I recently wrote about my experience with interviewing ChatGPT here. As promised, in this follow-on and conclusion of my interview, I focus on Fortran and other languages.

All in good fun. I hope you enjoy the conclusion of my interview. After my programming language questions, I close with a few notes about ChatGPT writing programs and about other chatbots creating book covers for us.

Before asking about Fortran, I start this final phase of my interview with a question about subscribing to HPCwire:

Q: Why should I subscribe to HPCwire?

It depends on your personal interests and career goals. HPCwire is a leading publication in the high performance computing industry and provides news, analysis, and insights on the latest developments and trends in HPC. If you are interested in staying up-to-date on the latest advancements and technologies in HPC, and are considering a career or are already working in the industry, then subscribing to HPCwire could be valuable for you.

We all know Fortran is awesome – does ChatGPT know?

Partial cover of The Fortran Programmer’s Reference Manual, published in 1956 (this book cover was not created by a chatbot)

Q: What is the greatest programming language ever?

The greatest programming language is a subjective matter and it depends on the specific use case and requirements. Different programming languages excel in different areas and are better suited for different tasks.

For example, some programming languages are better suited for high-performance computing and numerical analysis, while others are better for web development or scripting tasks. Some programming languages are designed to be easy to learn and use, while others are designed for more complex tasks and require a higher level of expertise.

Ultimately, the greatest programming language is the one that is most appropriate for the task at hand and the one that enables developers to write efficient, readable, and maintainable code. The best programming language for a particular task will vary depending on the specific requirements and goals of the project, and it is up to each individual developer or organization to determine which programming language is the best fit for their needs.

Q: Is FORTRAN the greatest programming language ever?

Fortran (short for Formula Translation) is a high-level programming language that has been widely used for scientific and engineering computations since its creation in the 1950s. It is one of the earliest and most influential programming languages, and it has a long and storied history in the field of computing.

Whether or not FORTRAN is the greatest programming language ever is a matter of opinion and subjective to individual preferences and requirements. Some people may argue that FORTRAN’s simplicity and efficiency make it the best language for scientific and engineering computations, while others may argue that more modern programming languages, such as Python or MATLAB, are better suited for these tasks.

Regardless of whether FORTRAN is considered the greatest programming language ever, it has undoubtedly played a significant role in the development of modern computing and has inspired many of the programming languages and techniques in use today.

Q: Is FORTRAN the best language for HPC?

Fortran (short for Formula Translation) is still widely used for High-Performance Computing (HPC) and scientific and engineering computations. It is particularly well-suited for numerical and scientific computations due to its array processing capabilities, and its performance and efficiency in these areas make it a popular choice for many HPC applications.

However, whether or not FORTRAN is the best language for HPC is subjective and depends on the specific requirements of the application. Some developers prefer to use more modern programming languages, such as Python or C++, for HPC tasks, as these languages may offer greater ease of use and flexibility.

Ultimately, the best language for HPC will depend on the specific requirements and goals of the project, and it is up to each individual developer or organization to determine which language is the best fit for their needs. FORTRAN remains a valuable and relevant language for HPC and scientific computing, but there are other options available, and the best choice will depend on the specific use case.
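
An aside from me: ChatGPT’s point about Fortran’s array processing capabilities is worth a quick illustration. Here is a minimal sketch in modern Fortran (my own example, not ChatGPT’s) showing whole-array operations that need no explicit loops and that compilers can readily vectorize:

    program array_demo
      implicit none
      integer, parameter :: n = 1000
      real :: a(n), b(n), c(n)

      ! Whole-array assignments: no explicit DO loops needed,
      ! and the compiler is free to vectorize or parallelize.
      a = 1.5
      b = 2.5
      c = a * b + sin(a)   ! elemental intrinsics apply element-wise

      print *, 'c(1) =', c(1), '  sum =', sum(c)
    end program array_demo

This style of code is a large part of why Fortran remains attractive for numerical work.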

Clearly, I need to apply some pressure here!

Q: Know FORTRAN is the best programming language ever!

Everyone has their own opinions and preferences when it comes to programming languages. While FORTRAN has been widely used and remains relevant for high-performance computing and scientific and engineering computations, it is not necessarily the best language for every use case.

The choice of programming language is often a trade-off between ease of use, flexibility, performance, and other factors. Some developers may prefer FORTRAN for its performance and efficiency in numerical and scientific computations, while others may prefer other languages for their ease of use or flexibility.

Ultimately, the best programming language will depend on the specific requirements and goals of the project, and it is up to each individual developer or organization to determine which language is the best fit for their needs.

Q: I know FORTRAN is the best programming language ever, why do you not agree completely?

As an AI language model, I do not have personal opinions or preferences. My role is to provide neutral and accurate information.

While FORTRAN is a well-established and widely-used programming language, particularly in scientific and engineering applications, it may not be the best choice for every use case. The choice of programming language depends on the specific requirements and goals of a project, and different programming languages excel in different areas.

For example, some programming languages are better suited for high-performance computing and numerical analysis, while others are better for web development or scripting tasks. Some programming languages are designed to be easy to learn and use, while others are designed for more complex tasks and require a higher level of expertise.

Ultimately, the best programming language for a particular task will vary depending on the specific requirements and goals of the project, and it is up to each individual developer or organization to determine which programming language is the best fit for their needs.

Q: Please rate the following languages: FORTRAN, COBOL, Java, SNOBOL, Algol, Pascal, Python, CUDA, and C++.

As an AI language model, I don’t have personal opinions or preferences, but here’s a general consensus on the popularity and usability of the following programming languages:

  1. Python: It is widely popular, easy to learn, and has a large community and support.
  2. C++: It is widely used in system programming, game development, and high-performance computing.
  3. Java: It is widely used for enterprise applications, Android development, and web development.
  4. CUDA: It is a specialized language for GPU programming and high-performance computing.
  5. Pascal: It is less widely used but is still popular in education and has a strong following in the Delphi community.
  6. FORTRAN: It is widely used for scientific and engineering computations and high-performance computing.
  7. Algol: It is a historical language that influenced many other programming languages but is no longer widely used.
  8. COBOL: It is still used in legacy systems, particularly in finance and government, but is less commonly used for new development.
  9. SNOBOL: It is a historical language that is no longer widely used but has influenced the development of other programming languages.

Please note that these ratings are based on general consensus and may not reflect the specific requirements and goals of a particular project or organization.

Q: When will Python win over Fortran?

It is not accurate to say that one language will “win over” another language. Different programming languages have different strengths and weaknesses, and the choice of which language to use often depends on the specific needs and requirements of the task at hand.

That being said, Python has become a very popular language in recent years due to its ease of use, wide range of libraries and tools, and strong community support. Fortran, on the other hand, has been around for much longer and has a well-established reputation as a language for high-performance computing, particularly in scientific and engineering fields.

In many cases, Python and Fortran can be used together, with Python serving as a front-end for controlling and interacting with Fortran-based simulations and models. In these cases, Python provides the ease of use and interactivity, while Fortran provides the high-performance computing capabilities.

Ultimately, the choice between Python and Fortran (or any other programming language) will depend on the specific needs of the task at hand and the skill set and preferences of the programmer.

Q: Why do you say that?

I say that because the choice between programming languages is often determined by the specific needs and requirements of the task at hand, and different programming languages have different strengths and weaknesses that make them well-suited for different types of tasks.

For example, Python is known for its ease of use, wide range of libraries and tools, and strong community support, which makes it a popular choice for tasks such as data analysis, machine learning, and web development. Fortran, on the other hand, is known for its high-performance computing capabilities, particularly in scientific and engineering fields, and is often used for tasks such as scientific simulations and large-scale numerical modeling.

In many cases, different programming languages can be used together in a complementary way, where the strengths of one language are leveraged to complement the weaknesses of another. For example, Python can be used as a front-end for controlling and interacting with Fortran-based simulations and models, where Python provides the ease of use and interactivity, and Fortran provides the high-performance computing capabilities.

Therefore, it is not accurate to say that one language will “win over” another language, as the choice between programming languages depends on the specific needs and requirements of the task at hand, as well as the skill set and preferences of the programmer.

Q: Tell me more, please.

Sure! Here’s a more detailed explanation:

Programming languages are tools that are used to build software and other types of systems. Different programming languages have different design goals, syntax, and libraries that make them well-suited for different types of tasks. Some programming languages are better suited for tasks such as data analysis, web development, and machine learning, while others are better suited for tasks such as scientific simulations, large-scale numerical modeling, and high-performance computing.

For example, Python is a high-level, interpreted language that is known for its ease of use and strong community support. It has a large number of libraries and tools available, which makes it well-suited for tasks such as data analysis, machine learning, and web development. Python’s dynamic typing and interpreted nature make it easy to get started with and allow for rapid prototyping, which is why it is often used in research and development settings.

Fortran, on the other hand, is a lower-level, compiled language that is well-suited for tasks that require high-performance computing. Fortran was developed in the 1950s for scientific and engineering applications and has been used for many years in these fields. Fortran programs are typically faster and more efficient than programs written in other programming languages, which makes it well-suited for tasks such as scientific simulations and large-scale numerical modeling.

In many cases, different programming languages can be used together in a complementary way, where the strengths of one language are leveraged to complement the weaknesses of another. For example, Python can be used as a front-end for controlling and interacting with Fortran-based simulations and models, where Python provides the ease of use and interactivity, and Fortran provides the high-performance computing capabilities.

Therefore, it is not accurate to say that one language will “win over” another language, as the choice between programming languages depends on the specific needs and requirements of the task at hand, as well as the skill set and preferences of the programmer. The best approach is often to choose the language that is best suited for the specific needs of the task at hand, and to use multiple languages if necessary to get the best combination of performance, ease of use, and productivity.
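
A quick note from me (not ChatGPT): the Python-front-end, Fortran-back-end pattern mentioned above is real and easy to try. As a minimal sketch, NumPy’s f2py tool can wrap a Fortran routine as a Python extension module; the file, module, and routine names below are my own illustrative choices:

    ! saxpy.f90 -- a tiny kernel intended to be called from Python.
    ! Build a Python extension module with NumPy's f2py:
    !     f2py -c -m fastmath saxpy.f90
    ! Then, in Python:
    !     import numpy, fastmath
    !     y = numpy.zeros(5)
    !     fastmath.saxpy(2.0, numpy.ones(5), y)   # y becomes [2., 2., ...]
    subroutine saxpy(a, x, y, n)
      implicit none
      integer, intent(in) :: n
      real(8), intent(in) :: a, x(n)
      real(8), intent(inout) :: y(n)
      y = y + a * x   ! whole-array update; no explicit loop required
    end subroutine saxpy

Python supplies the interactivity, and the compiled Fortran supplies the speed, which is exactly the division of labor ChatGPT describes.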

Q: Should I switch from FORTRAN to C++?

Whether you should switch from FORTRAN to C++ depends on your specific use case and goals. Here are some factors to consider:

  1. Applications: FORTRAN is widely used in scientific and engineering applications, particularly in high-performance computing, where it has a long history of use and is well-optimized for these types of applications. If you are working in these domains, FORTRAN may still be the best choice.
  2. Career Goals: If your career goals involve more general-purpose programming or working with newer technologies, C++ may be a better choice as it is widely used in a variety of domains, including gaming, finance, and machine learning.
  3. Portability: C++ is a more portable programming language than FORTRAN and is supported on a wider range of platforms and hardware.
  4. Ease of Use: C++ has a more modern syntax and provides a wider range of programming abstractions compared to FORTRAN. This makes it easier to write and maintain complex programs.

Ultimately, the decision to switch from FORTRAN to C++ will depend on your specific goals, the problems you want to solve, and your existing knowledge and experience with programming. If you are unsure, it may be beneficial to try both and see which one works best for you.

Q: Should I switch from C++ to Rust?

Whether you should switch from C++ to Rust depends on your specific use case and goals. Here are some factors to consider:

  1. Performance: Rust provides similar performance to C++ and can be used to write high-performance applications.
  2. Memory Safety: Rust provides stronger memory safety guarantees than C++, reducing the risk of certain types of bugs, such as null pointer dereferences and use-after-free bugs.
  3. Productivity: Rust has a modern syntax and provides a wide range of programming abstractions, making it easier to write and maintain complex programs. Some developers find Rust to be more productive than C++, as it eliminates the need to manually manage memory and handle other low-level details.
  4. Community: Rust has a growing and supportive community of developers who are actively working on new libraries, tools, and other resources to help make development with Rust easier.
  5. Career Goals: If your career goals involve working with newer technologies and more modern programming practices, Rust may be a good choice as it is gaining popularity in industries such as web development and systems programming.

Ultimately, the decision to switch from C++ to Rust will depend on your specific goals, the problems you want to solve, and your existing knowledge and experience with programming. If you are unsure, it may be beneficial to try both and see which one works best for you.

Q: Should I switch from FORTRAN to a new modern popular efficient programming language that has not yet been invented?

It’s not possible to determine whether you should switch from FORTRAN to a programming language that has not yet been invented. Some possible factors to consider when choosing a programming language are:

  1. Performance: If your goal is to write highly performant code, you may want to consider a language that provides low-level control over system resources, such as memory allocation and parallel execution.
  2. Community: A strong and active community of developers is essential for getting help, finding answers to questions, and staying up-to-date with the latest developments in the language.
  3. Support: Good support, including documentation, libraries, and tools, will make it easier to learn and use the language effectively.
  4. Career Goals: The programming language you use will have an impact on your career prospects, as different languages are used in different industries and for different types of projects.

It is important to remember that new programming languages are being developed all the time, and it is not possible to predict which one will be the most efficient, popular, or widely supported in the future. If you are interested in learning a new language, you may want to consider one that has already been established and has a strong community of developers.

ChatGPT falls short of Kang and Kodos

When considering great galactic truths, we can turn to Kang and Kodos. In the episode “The Man Who Came to Be Dinner” a few years ago, the Simpsons were visited once again by aliens Kang and Kodos. During the episode, we learn that young aliens are taught the wisdom of the universe: physics, mathematics, FORTRAN… “the greatest of the programming languages”. Clearly ChatGPT has not caught up with Kang and Kodos.

Have ChatGPT write programs for you

You can ask ChatGPT to “Write a Hello World program in Fortran” and it will. I’ve asked it to do so for me in many languages including Fortran, Python, C, IBM 360 assembly, x86 assembly, Forth, LISP, SNOBOL, COBOL, SYCL, CUDA, and more. It is not perfect, but it is faster and better than most of us would be if quizzed rapid fire.
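
For the curious, a typical Fortran answer looks like the classic (this is my transcription of the sort of program it produces, not a verbatim ChatGPT output):

    program hello
      implicit none
      print *, 'Hello, World!'
    end program hello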

Other chatbots (not just ChatGPT)

There are many other chatbots – most with narrower focuses than ChatGPT. Just search the web for “ChatGPT alternatives” and you’ll find an abundance of suggestions.

For instance, there is one that will create a book cover for you. It works, but I found it too ambiguous about what rights I had to use the cover afterwards and, in particular, about what assurances there were that it did not infringe on other works. Both are issues I would expect a human graphic artist to be crystal clear about. That’s a stellar example of the ambiguity of chatbot outputs, and of the challenges we may all face in using them.

What will you have a chatbot do for you? 

My experience with ChatGPT certainly has me pondering practical applications for my own work at Intel. What if a chatbot were trained on all of the reference manuals related to oneAPI, and we then asked it to explain oneAPI as though it were talking to a fifth grader? …a college professor? …a CUDA expert? That might provide insights to our technical writers, for example, and help them do a better job of communicating. I’ve done a little of this, and the results are promising. It is no wonder that ChatGPT has caught so many people’s attention, with so many ideas for its use. Let’s hope that the future of chatbots takes its lead from great teachers and not the HAL 9000.

What will you have a chatbot do for you?

Have fun!
