Intelligence is the capacity to handle complexity. Someone with a higher IQ is able to grasp more complex problems; in other words, they are able to "hold more pieces in their head at a time". Intelligence cannot be trained or learned; it is set genetically.
We may think of IQ as the ability to lift a mental weight. If someone is able to lift 100 kg, the type of weight does not matter, as long as the weight is at most 100 kg. To lift more than that, tools are needed.
Let's rate a problem's complexity (PC) and the intelligence of its solver (IQ). If PC < IQ, the effort needed to solve the problem grows linearly with its complexity. If PC > IQ, the effort grows exponentially. In the second case, we can speak of cognitive overload.
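To make this concrete, here is a toy model in Java. The functional form and the constants are my own illustrative assumptions, not an established formula; the point is only the shape of the curve.

```java
// Toy model of the claim above: effort grows linearly while PC <= IQ,
// and exponentially once problem complexity exceeds the solver's capacity.
public class EffortModel {
    static double effort(double pc, double iq) {
        if (pc <= iq) {
            return pc;                       // linear region
        }
        return iq + Math.exp(pc - iq) - 1;   // exponential region, continuous at pc == iq
    }

    public static void main(String[] args) {
        double iq = 100;
        System.out.println(effort(50, iq));   // well within capacity
        System.out.println(effort(100, iq));  // at the limit
        System.out.println(effort(110, iq));  // overload: cost explodes
    }
}
```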
Software development is the process of describing (complex) behaviour in the form of computer code. The result of this process is an application.
With time, the complexity of applications tends to grow. We can think of this as the low-hanging fruit, in the form of simple applications, having already been picked; today, mostly complex applications remain to be developed. Of course, part of this trend is caused by having more possibilities to develop complex software, thanks to better hardware and other inventions in the IT field.
This growth in complexity is obviously not specific to software development. It can be observed in other fields as well. Cars, electronics, appliances, buildings, or the world in general are more complex than they were a hundred years ago.
To handle complexity in software development that exceeds one's IQ, tools and processes are needed. The aim of such tools and processes is to keep development in the area where PC < IQ, making it effective and less risky.
Similarly to the previous point, tools and processes used to handle complexity are not exclusive to software development. For example, we all organise our clothes in a closet to be able to find a t-shirt quickly. If we build a house, we use certain processes, developed over centuries, which help us build it faster than if we had to "invent everything from scratch".
It is tempting not to use some of these tools and processes, to seemingly make development faster, especially in the early phases of a project. The usual reasoning behind such a decision is saving time and cost. This may be true for smaller applications. But beyond a certain level of complexity, if tools and processes are not in place, the effort and cost needed to expand an application grow exponentially.
Examples of tools and processes in software development, at different levels of detail:
Modern programming languages are developed with complexity reduction and productivity improvement in mind. These languages constrain developers in some ways, limiting their options to make mistakes: for example, by making mutability or nullability non-default behaviour, or by incorporating ideas from functional programming, thus reducing side effects in code.
This trend can be observed in languages like Kotlin (an evolutionary step among JVM languages), which introduced immutable data classes, a safer alternative to switch statements (the "when" expression), and a dedicated keyword ("val") for read-only variables. Kotlin Flows can be understood as an idea adopted from functional programming, making concurrent programming more accessible.
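The same constraint-driven trend is visible in modern Java, which can serve as a sketch of these ideas: records as immutable data carriers, exhaustive switch expressions over sealed types, and "final" playing the role of Kotlin's "val". The analogy is mine; the class and type names below are hypothetical.

```java
public class ConstraintDemo {
    // A record is an immutable data carrier, analogous to a Kotlin data class
    // with val-only properties: all fields are final and no setters exist.
    record Point(int x, int y) {}

    sealed interface Shape permits Circle, Square {}
    record Circle(double radius) implements Shape {}
    record Square(double side) implements Shape {}

    // A switch expression over a sealed type must be exhaustive, so the
    // compiler, not the developer's memory, guarantees every case is handled.
    static double area(Shape s) {
        return switch (s) {
            case Circle c -> Math.PI * c.radius() * c.radius();
            case Square sq -> sq.side() * sq.side();
        };
    }

    public static void main(String[] args) {
        final Point p = new Point(1, 2);          // 'final' as Kotlin's 'val'
        System.out.println(p);                    // Point[x=1, y=2]
        System.out.println(area(new Square(3)));  // 9.0
    }
}
```

Note that the compiler rejects the switch if a new Shape subtype is added without a corresponding case: the language removes a whole class of mistakes from the developer's plate.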
Swift and Go, among other things, make error handling a mandatory part of a method's API, so a developer doesn't have to wonder whether a method call can fail unexpectedly (think of unchecked exceptions in Kotlin).
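As a sketch of the same idea in Java terms, a checked exception makes the possibility of failure part of the method signature, much like Swift's "throws" or Go's error return value. The names ParseFailure and parsePositive below are hypothetical.

```java
public class CheckedDemo {
    static class ParseFailure extends Exception {
        ParseFailure(String msg) { super(msg); }
    }

    // A caller cannot invoke parsePositive() without either catching
    // ParseFailure or declaring it: failure is visible in the API.
    static int parsePositive(String s) throws ParseFailure {
        int n;
        try {
            n = Integer.parseInt(s);
        } catch (NumberFormatException e) {
            throw new ParseFailure("not a number: " + s);
        }
        if (n <= 0) throw new ParseFailure("not positive: " + s);
        return n;
    }

    public static void main(String[] args) {
        try {
            System.out.println(parsePositive("42"));  // 42
            System.out.println(parsePositive("-1"));  // throws before printing
        } catch (ParseFailure e) {
            System.out.println("handled: " + e.getMessage());
        }
    }
}
```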
Programming paradigms like object-oriented programming (OOP) are ways of organising code, hiding (encapsulating) the internal details of the code behind object APIs. Again, this limits the developer's options. It reduces the quantity of moving parts in the code, which allows developers to spend their brain capacity on higher-level designs and ideas while ignoring the hidden details.
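A minimal sketch of this encapsulation, with a hypothetical Account class: callers interact only with the API, while the internal representation stays private and can change without affecting any caller.

```java
public class Account {
    // Hidden moving part: a running total in cents. It could later become
    // a full ledger without any caller noticing.
    private long cents;

    public void deposit(double amount) {
        cents += Math.round(amount * 100);
    }

    public double balance() {
        return cents / 100.0;
    }

    public static void main(String[] args) {
        Account a = new Account();
        a.deposit(10.25);
        a.deposit(0.10);
        System.out.println(a.balance());  // 10.35
    }
}
```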
OOP can be considered an evolutionary step in the area of programming paradigms. As software complexity grows, it is expected that new paradigms will be invented.
Modularisation is another way of organising code, at a higher level.
In the world of JVM languages, we can think of the modules provided by various build systems (Gradle, Maven) or by the language itself (Project Jigsaw). Such modules hide implementation details behind an API, reducing the number of moving parts in the application.
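As a sketch, a Project Jigsaw module descriptor makes this hiding explicit: only exported packages are visible to other modules, even if the classes inside are public. The module and package names below are hypothetical.

```java
// module-info.java: everything not exported is inaccessible to other
// modules, so internal packages cannot become accidental dependencies.
module com.example.billing {            // hypothetical module name
    requires java.sql;                  // explicit, non-transitive dependency
    exports com.example.billing.api;    // the public surface of the module
    // com.example.billing.internal is not exported: an implementation detail
}
```

This is a descriptor fragment, not a runnable program; it only compiles alongside the packages it names.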
As time goes on, we can see that modularisation tools are becoming more restrictive, similarly to programming languages. Think of the recent shift towards non-transitive dependencies in Gradle, which further limits the number of moving parts in the application.
Automated testing is a tool which allows developers to make changes in one part of an application without breaking other parts.
Tests give developers the confidence to make changes without having to reason about every piece of the application, allowing them to focus on the important pieces of code.
Of course, testing also allows potential issues to be discovered early, which lowers the expense of debugging.
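A minimal sketch of the idea, using plain Java assertions in place of a real framework such as JUnit; slugify is a hypothetical helper under test. When someone later refactors it, a broken behaviour fails here immediately instead of surfacing elsewhere in the application.

```java
public class SlugifyTest {
    // The code under test: a helper turning titles into URL slugs.
    static String slugify(String title) {
        return title.trim().toLowerCase()
                    .replaceAll("[^a-z0-9]+", "-")
                    .replaceAll("^-|-$", "");
    }

    public static void main(String[] args) {
        // Each check pins down one expected behaviour.
        check(slugify("Hello, World!").equals("hello-world"));
        check(slugify("  Agile 101 ").equals("agile-101"));
        check(slugify("---").equals(""));
        System.out.println("all tests passed");
    }

    static void check(boolean ok) {
        if (!ok) throw new AssertionError("test failed");
    }
}
```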
The agile way of software development is a managerial tool that allows teams to reduce the risks resulting from project complexity.
The key ideas of "agile" are short development cycles and early feedback, to reduce the losses from "going in the wrong direction". Longer development cycles cost more money because of the time spent by the people involved. As mentioned, there is a possibility that the functionality developed during a cycle was not well understood. When such issues are discovered, usually during delivery to the customer, longer development cycles become more expensive.
Several methodologies have formalised the agile way of development. One of the most popular is Scrum, which can be thought of as a collection of agile best practices. For example, Scrum defines a sprint (usually a few weeks) as an optimal period in which a team can develop functionality without being overwhelmed by communication with the customer, while at the same time not being so long that the cost of "going the wrong way" becomes excessive. Compare this to more traditional approaches (Waterfall), where feedback is provided after longer periods of time, making the potential losses caused by "going the wrong way" higher.
When implementing tools and processes with the goal of complexity reduction, no cost-benefit analysis is usually done beforehand. Instead, such tools and processes are accepted as a set of best practices, and their impact is evaluated after the fact. Tools and processes are then incrementally tweaked and improved as they evolve within a team or a company.
It is hard to calculate exactly the impact of introducing automated tests into a development cycle, so the adoption of such a process is usually based on shared experience in the software development field. Its benefits are measured after the fact, by various metrics.
Let's make a comparison with a mechanical engineer, who doesn't ask whether he or she should draw schematics as a way of communicating with machinists, but rather decides which CAD software to use. Similarly, in software projects with a higher level of complexity, the question is not whether automated testing should be incorporated, but what tools should be used to make development more effective.
Today, we encounter computers and software in every walk of life. The software we develop today is more complex than the software developed a decade or two ago. If we extrapolate this trend into the future, it is safe to say that software complexity will keep growing.
To handle this growing complexity, the software development field will have to evolve. New, more effective tools and processes will have to be developed. Even though what we have today is far better than what we had a few decades ago, the current state of the field is just an evolutionary step. On that note, companies and teams who adapt and evolve these tools will stay competitive; those who don't will go out of business.
On a personal level, I tend to adopt such tools and processes, and develop new ones, to increase my productivity. So, with my average IQ, I can stay relevant in the field of software development, which tends to attract people with above-average intelligence.