The 7 Most Important Computer Science Trends in 2020

Here are the 7 fastest-growing computer science trends of 2020, and how these technologies are challenging the status quo in the office and on college campuses.

Whether you’re a fresh computer science graduate or a veteran IT executive, these are the top trends to explore.

1. Quantum computing makes waves

“Quantum computing” searches - interest spiked in late 2019 when Google announced it had achieved quantum supremacy.

Quantum computing is the use of quantum mechanics, such as entanglement and superposition, to perform computations. It uses quantum bits (qubits) in a similar way that regular computers use bits.

Quantum computers have the potential to solve problems that would take the world's most powerful supercomputers millions of years.
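The qubit behavior described above can be sketched in ordinary code. The toy Python simulator below is purely illustrative (not a real quantum computing library): it puts a single qubit into an equal superposition with a Hadamard gate and then samples measurements.

```python
import math
import random

# Toy single-qubit simulator: a state is a pair of amplitudes (alpha, beta)
# with alpha^2 + beta^2 = 1. Illustration only -- real quantum hardware and
# simulators work with complex amplitudes and many entangled qubits.

def hadamard(state):
    """Apply the Hadamard gate, which puts |0> into an equal superposition."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

def measure(state):
    """Collapse the state: returns 0 with probability alpha^2, else 1."""
    alpha, _ = state
    return 0 if random.random() < alpha ** 2 else 1

qubit = hadamard((1.0, 0.0))                    # start in |0>, apply H
counts = [measure(qubit) for _ in range(10_000)]
print(sum(counts) / len(counts))                # close to 0.5: equal odds of 0 and 1
```

Until measured, the qubit is genuinely in both states at once; sampling many measurements is what reveals the 50/50 split.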


IBM’s System One - the first-ever circuit-based commercial quantum computer.

Companies including IBM, Microsoft and Google are all in competition to build reliable quantum computers.

In fact, Google AI and NASA recently published a joint paper that claimed to have achieved "quantum supremacy". This is when a quantum computer outperforms a traditional one at a particular task.

Quantum computers could completely transform data science. They also have the potential to accelerate the development of artificial intelligence, virtual reality, big data, deep learning, encryption, medicine and more. The downside is that quantum computers are currently incredibly difficult to build and sensitive to interference.

Despite current limitations, it's fair to expect further advances from Google and others that will help make quantum computers practical to use, which would position quantum computing as one of the most important computer science trends of the years ahead.

2. Zero Trust becomes the norm

“Zero Trust” searches - general awareness of this security concept started to take off in 2018.

Most information security frameworks used by organizations rely on traditional trust-based authentication methods (like passwords). These frameworks focus on protecting network access. And they assume that anyone who has access to the network should be able to access any data and resources they'd like.

There's a big downside to this approach: a bad actor who gets in via any entry point can then move around freely to access all data, or delete it altogether.

Zero Trust information security models aim to prevent this. They replace the old assumption that every user within an organization’s network can be trusted.

Instead, nobody is trusted, whether they’re already inside or outside the network. Verification is required from everyone trying to gain access to any resource on the network.
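As a rough sketch of the idea (the users, resources, and token scheme below are invented for illustration), every request must present a verifiable token and pass a per-resource authorization check; being "inside the network" grants nothing by itself.

```python
import hmac
import hashlib

# Minimal Zero Trust sketch, illustrative only: identity is verified on every
# request, and access is checked per resource rather than granted network-wide.

SECRET = b"rotate-me-regularly"                      # hypothetical signing key
ACL = {"alice": {"reports"}, "bob": {"reports", "billing"}}

def issue_token(user: str) -> str:
    """Sign the user's identity so it can be verified later."""
    sig = hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()
    return f"{user}:{sig}"

def authorize(token: str, resource: str) -> bool:
    """Verify the token, then check rights for this specific resource."""
    user, _, sig = token.partition(":")
    expected = hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):       # never trust unverified identity
        return False
    return resource in ACL.get(user, set())          # least-privilege check

token = issue_token("alice")
print(authorize(token, "reports"))   # True: verified user with explicit rights
print(authorize(token, "billing"))   # False: valid user, but no access granted
```

Note that "alice" being authenticated is not enough on its own; every resource check is made independently.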

Huge companies like Cisco are investing heavily to develop Zero Trust solutions.

This security architecture is quickly moving from just a computer science concept to industry best practice. And it’s little wonder why: IBM reports that the average data breach costs a company $3 million in damages.

We will see demand for this technology continue to skyrocket in 2020 as businesses adopt Zero Trust security to mitigate this risk.

3. Cloud computing hits the edge

“Edge computing” searches - this market may be worth $3.24 billion by 2025.

Gartner estimates that 80% of enterprises will shut down their traditional data centers by 2025. This is mainly because traditional cloud computing relies on servers in one central location.

If the end-user is in another country, they have to wait while data travels thousands of miles. Latency issues like this can really hamper an application’s performance.

So companies are moving over to edge computing service providers instead.

Modern edge computing brings computation, data storage, and data analytics as close as possible to the end-user location. And when edge servers host web applications the result is massively improved response times.
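A back-of-the-envelope calculation shows why proximity matters. The distances and the roughly 200,000 km/s figure for light in optical fiber are illustrative assumptions; real latency also includes routing and processing overhead.

```python
# Distance alone sets a hard floor on round-trip time: light in optical fiber
# travels at roughly 200,000 km/s, so a faraway server can never respond fast.

SPEED_IN_FIBER_KM_S = 200_000  # approximate propagation speed in fiber

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time in milliseconds for a given one-way distance."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000

print(round_trip_ms(8000))  # distant central data center: 80.0 ms floor
print(round_trip_ms(100))   # nearby edge server: 1.0 ms floor
```

Serving from an edge node a hundred kilometers away instead of a data center on another continent cuts the physical lower bound on latency by nearly two orders of magnitude.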



According to Wired, approximately 10% of web traffic now goes through Cloudflare.

As a result, some estimates suggest that the edge computing market will be worth $3.24 billion by 2025.

And Content Delivery Networks like Cloudflare that make edge computing easy and accessible will increasingly power the web.

4. Kotlin overtakes Java
“Kotlin” searches - interest in this programming language rocketed in 2017 and growth hasn't slowed down since.

Kotlin is a general-purpose programming language that first appeared in 2011. It’s designed specifically to be a more concise and streamlined version of Java.

And so it works for both JVM (Java Virtual Machine) and Android development.

Kotlin is billed as a modern programming language that makes developers happier.

There are over 7 million Java programmers in the world right now. And since Kotlin offers big advantages over Java, we can expect many of them to transition over in 2020.

Google even made the announcement in 2019 that Kotlin is now its preferred language for Android app developers.

5. The web becomes more standardized
“OpenAPI Specification” searches - OpenAPI became a separate project from the Swagger framework in 2016.

REST (Representational State Transfer) web services power the internet and the data behind it. But the structure of each REST API data source varies wildly; it depends entirely on how the programmer behind it chose to design it.

The OpenAPI Specification (OAS) changes this. It’s essentially a description format for REST APIs.

Data sources that implement OAS are easy to learn and readable to both humans and machines. This is because an OpenAPI file describes the entire API, including available endpoints, operations and outputs.

This standardization enables the automation of previously time-consuming tasks.

For example, tools like Swagger generate code, documentation and test cases given the OAS interface file. This can save a huge amount of engineering time both upfront and in the long run.
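To make this concrete, here is a minimal OpenAPI 3.0 description sketched as a Python dictionary. The endpoint and schema are invented for the example; real spec files are usually written in YAML or JSON.

```python
import json

# A minimal OpenAPI 3.0 document describing one hypothetical endpoint.
# Because the format is standardized, tools can read it to generate
# clients, server stubs, documentation, and test cases automatically.

spec = {
    "openapi": "3.0.0",
    "info": {"title": "Example Users API", "version": "1.0.0"},
    "paths": {
        "/users/{id}": {
            "get": {
                "summary": "Fetch a single user",
                "parameters": [{
                    "name": "id",
                    "in": "path",
                    "required": True,
                    "schema": {"type": "integer"},
                }],
                "responses": {
                    "200": {"description": "The requested user"},
                },
            }
        }
    },
}

print(json.dumps(spec, indent=2))  # machine-readable and human-readable
```

Everything a client needs to know (the path, the required integer `id` parameter, the possible responses) is captured in one file rather than scattered across prose documentation.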

Another technology that takes this concept to the next level is GraphQL. This is a data query language for APIs developed at Facebook.

It provides a complete description of the data available in a particular source. And it also gives clients the ability to ask for only the specific parts of the data they need and nothing more.
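The idea can be illustrated with a toy resolver. Real GraphQL servers parse a full query language; this sketch just takes a list of field names, and the user record is made up for the example.

```python
# Toy illustration of GraphQL's core idea: the client names exactly the
# fields it wants, and the server returns those fields and nothing more.

USER = {"id": 42, "name": "Ada", "email": "ada@example.com", "bio": "Pioneer"}

def resolve(requested_fields):
    """Return only the fields the client asked for -- no over-fetching."""
    return {f: USER[f] for f in requested_fields if f in USER}

print(resolve(["id", "name"]))  # {'id': 42, 'name': 'Ada'}
```

A REST endpoint would typically return the whole record; here the `email` and `bio` fields never cross the wire unless explicitly requested.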



GraphQL is a query language for APIs and a runtime for fulfilling those queries with your existing data.

It too has become widely used and massively popular. Frameworks and specifications like this that standardize all aspects of the internet will continue to gain wide adoption.

6. More digital twins

“Digital twin” searches - interest has been steadily growing over the last 5 years.

A digital twin is a software representation of a real-world entity or process, from which you can generate and analyze simulation data. This way you can improve efficiency and avoid problems before devices are even built and deployed.
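A minimal sketch of the idea, with an invented motor model: the software twin mirrors a physical device's state, so you can run what-if simulations before touching the real hardware.

```python
# Illustrative digital-twin sketch. The heating/cooling model is invented
# for the example; real twins are calibrated against sensor data from the
# physical device they mirror.

class MotorTwin:
    def __init__(self, temp_c: float = 20.0, max_temp_c: float = 90.0):
        self.temp_c = temp_c
        self.max_temp_c = max_temp_c

    def simulate(self, load: float, hours: int) -> float:
        """Crude model: each hour heats the motor with load, minus cooling."""
        for _ in range(hours):
            self.temp_c += 5.0 * load   # heating proportional to load
            self.temp_c -= 2.0          # passive cooling per hour
        return self.temp_c

    def overheats(self) -> bool:
        return self.temp_c > self.max_temp_c

twin = MotorTwin()
twin.simulate(load=2.0, hours=12)
print(twin.overheats())  # True: the planned duty cycle would fry the motor
```

Running the twelve-hour scenario in software flags the overheating risk before the real motor is ever run at that load.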

GE is the big name in the field and has developed internal digital twin technology to improve its own jet-engine manufacturing process.


GE's Predix platform is a huge player in the digital twin technology market.

This technology was initially only available at the big enterprise level, with GE’s Predix industrial Internet of Things (IoT) platform.

But now we’re seeing its usage permeate across other sectors like retail warehousing, auto manufacturing, and healthcare planning.

Yet case studies of these real-world use cases are still thin on the ground, so the people who produce them will establish themselves as experts in their field.

7. Demand for cybersecurity expertise skyrockets
“Hack The Box” searches - demand for this cybersecurity platform is showing exponential growth.

According to CNET, at least 7.9 billion records (including credit card numbers, home addresses and phone numbers) were exposed through data breaches in the last year alone.

As a consequence, large numbers of companies seek cybersecurity expertise to protect themselves.

Hack The Box is an online platform with a wealth of educational information and 112 cybersecurity-themed challenges. Its 290,000 active users test and improve their skills in penetration testing.

So it has become the go-to place for companies to recruit new talent for their cybersecurity teams.

Hack The Box is a hacker haven both in terms of content and design.

And software that helps people to identify if they’ve had their credentials compromised by data breaches will also trend.

One of the most well-known tools currently is Have I Been Pwned. It allows you to search across multiple data breaches to see if your email address has been compromised.
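For password checks, Have I Been Pwned's companion Pwned Passwords service uses a k-anonymity range query: you hash the password locally with SHA-1 and send only the first five hex characters, then compare the returned hash suffixes on your own machine. The sketch below shows the local half of that scheme; no network call is made.

```python
import hashlib

# Local half of the Pwned Passwords k-anonymity check: the full password,
# and even the full hash, never leave your machine. Only a 5-character
# hash prefix would be sent (e.g. to the service's /range/{prefix} endpoint).

def hash_prefix_and_suffix(password: str) -> tuple[str, str]:
    """Split the uppercase SHA-1 hex digest into a 5-char prefix and suffix."""
    digest = hashlib.sha1(password.encode()).hexdigest().upper()
    return digest[:5], digest[5:]

prefix, suffix = hash_prefix_and_suffix("password123")
print(prefix)       # the only data that would be sent over the network
print(len(suffix))  # 35 hex chars, matched locally against the API response
```

Because thousands of leaked hashes share any given 5-character prefix, the service learns almost nothing about which password you actually checked.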

That's the list

That's our list of the 7 most important computer science trends to keep an eye on in 2020.

This is an exciting time to be in the computer science field.

CS has always been a rapidly-changing field.

But with the growth of completely new technologies (especially cloud computing and machine learning), it's fair to expect that the rate of change will increase in 2020 and beyond.

Published: April 6, 2020
Josh Howarth, Co-founder of Exploding Topics.
