
The top 3 technological trends that will dominate 2024

Opinion Articles

Pedro Varela

Bliss Applications

An opinion article, “The top 3 technological trends that will dominate 2024”, by Pedro Varela, Bliss’s Head of Commercial and Partnerships

Every year, new technological trends emerge, along with endless lists of the areas, tools, and systems that will supposedly dictate progress, innovation, and the right path for technology and its derivatives in the year ahead. Last year I talked about six technological trends, but this year I intend to focus on only three, to be more objective.

I’ll be talking about generative artificial intelligence, quantum computing, and sustainable technology.

Generative artificial intelligence can no longer be considered a trend; even so, the industry as a whole will start using it daily in different ways. Large language models will play a dominant and fundamental role in our productivity, whether in the basic user interface, through natural language, or in more complex daily tasks. Artificial intelligence (AI) will become a constant in our products. The trends of previous years, like series and movie recommendations on streaming platforms or the neural networks behind the photos we take on our mobile phones, will now be common and, in some cases, mandatory.

According to Gartner research, 40% of business applications will include conversational AI (e.g., chatbots that understand natural language), and 60% of marketing departments will use some form of generative AI. Another trend this year, within this area, is security in the use of artificial intelligence. The topic will gain weight, whether through the acts already approved by the European Union or through the global consortia working in this area, so that the secure use of AI is regulated.
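
To make the idea of conversational AI inside a business application a little more concrete, here is a minimal sketch of how a natural-language assistant could be wired into an existing product. It assumes the OpenAI Python SDK purely for illustration; the model name, system prompt, and example question are placeholders, and any hosted or self-hosted large language model could play the same role.

```python
# Minimal sketch: a natural-language assistant embedded in a business application.
# Assumes the OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY in the
# environment; the model name and system prompt are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

SYSTEM_PROMPT = "You are a support assistant for an online store. Answer briefly."

def answer(question: str, history: list[dict]) -> str:
    """Send the conversation so far plus the new question to the model."""
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]
    messages += history                      # previous {"role", "content"} turns
    messages.append({"role": "user", "content": question})

    response = client.chat.completions.create(
        model="gpt-4o-mini",                 # placeholder: any chat model works
        messages=messages,
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(answer("Where can I check the status of my order?", history=[]))
```

The conversational layer itself is a thin wrapper; the real work in a business application goes into grounding it in the company’s own data, processes, and security policies.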

Quantum computing, which was already a slight trend in 2023, is entering a new phase. There is no doubt that IBM, Microsoft, Google, and PsiQuantum hope that this year it will be possible to move from proofs of concept (POCs) to “normal” use, with quantum computing gradually being integrated into and consumed by our systems (and, of course, by new systems).

Within this theme, and looking at what could be one of the greatest computing revolutions ever, I recommend Quantum Supremacy by Michio Kaku, professor of theoretical physics at the City University of New York and co-founder of string field theory, which outlines the future implications of quantum computing as a mind-blowing change in technology.

Areas where we can already expect to see it used this year include the discovery of new drugs, improved human genome sequencing, the optimization of complex systems, the search for life beyond Earth, and cryptography (increased security).

Sustainable technology, more than a trend, will be a concern at the center of large corporations and governments around the world as they pursue the famous “net zero” goal. So far, much of what we have seen is easier to announce than to achieve, and the reduction of carbon emissions will play a fundamental role in technological services and products. The companies that lead, and that are relevant and fundamental in our lives, are moving very strongly in this direction. We recently saw, in Apple’s latest presentation, the first carbon-neutral products in the Apple Watch line (the goal is for all products to be carbon neutral by 2030).

The term “green cloud computing” will become more present in technology conversations. Infrastructure and services will prioritize reducing energy consumption and carbon emissions, and sustainable applications will be designed to help us live more ecologically. This includes the need to develop more sustainable and ethical methods for extracting the materials required to manufacture devices, and to design new structures that follow from the evolution of consumer habits.

The original version, in Portuguese, was published on SapoTEK.
