By one estimate, more than 100 billion lines of code are released each year, with an ever-increasing proportion of that software connected to the internet. With more connected code comes more risk of hackers exploiting that code for their own nefarious ends.
Given this opportunity for compromised code, bug-bounty programs are booming. Although a positive development, they’re just one component of delivering superior security.
IBM’s new CloudNativeJS project seeks to help developers build and deploy cloud-native Node.js applications via Docker containers and Kubernetes orchestration.
The CloudNativeJS project already offers a number of assets toward that goal.
Originally, the only way to create WebAssembly (or WASM for short) was to compile C/C++ code to WebAssembly using the Emscripten toolchain. Today, not only do developers have more language options, but it has become easier to compile these other languages directly to WebAssembly, with fewer intervening steps.
In this piece, we’ll examine the steps required to implement WebAssembly components in a web app. Because WebAssembly is a work-in-progress, the steps are highly dependent on which language you use, and the toolchain is likely to keep changing for some time. But right now, it’s possible to write and deploy useful, if minimal, WebAssembly applications in a number of languages.
The scary stories from the web are getting worse. First there were a few stolen credit card numbers. Then there were a few thousand. Now we hear about millions of financial records being exposed by security breaches, and we grow numb to the potential threat. Credit card numbers barely scratch the surface of what the bad guys are after, and there are more dangerous stories that come out of the labs studying cyber war.
Writing secure code begins long before the first loop is formed—and is no easy task. To even approximate bulletproof code, architects, engineers, auditors, and managers must try to imagine everything that could go wrong with every aspect of the code. Although it’s impossible to anticipate every nasty curve the attackers will throw, you have to do all you can to reduce your attack surface, plug holes, and guard against the fallout of a potential breach.
GitLab, a devops platform built on the Git version control system, gains increased visibility into security in its Version 11.1 release, along with other enhancements.
The new security dashboard reports on the latest security status of each project’s default branch. Security teams can determine if something is wrong and take action if needed. The dashboard can be used to dismiss false positives or create issues to solve vulnerabilities. Teams can also adjust the criticality weight of vulnerabilities. The security dashboard resides in the Project menu of a project’s side navigation.
Predictions about cloud computing in 2019 and 2020 are starting to come out, and I don’t see anything in them that isn’t already obvious.
Predictions like the “growth of cloud services” and “security will be more important” are so obvious that only those in an induced coma would not see them coming. Geez guys, you’re better than that.
It would be helpful to have one designated know-it-all who works on the bleeding edge of cloud computing every day, and who’s not afraid to predict what’s around the next bend in the road. The good news is that I’m not too shy to take on that role. So, here are three cloud trends that will have a profound effect on the cloud community in 2019, although most in the cloud industry don’t seem to see them coming.
Was it just a few years ago when we built our websites by lovingly placing each tag in the file with care and grace? That era of handcrafted websites is long gone. Most modern websites are elaborate programs that are constantly pinging multiple data sources and then churning out a complex confection of tags nested inside of tags nested inside of other tags. It's layers of code built on top of libraries linked to frameworks that call in web services, all to put some words and pictures on the screen.
Two of the favorite choices today for creating these elaborate mechanisms are React and Vue.js, two chunks of code that might be called libraries or frameworks depending upon how you define the words. They are machines for taking your collection of components and turning them into endlessly morphing, instantly reactive displays. Don't say that they're setting the foundation for websites, because the term web app is a better fit.
It’s easy to take advantage of Microsoft Azure cloud resources in ASP.Net Core, Microsoft’s cross-platform, lean, and modular framework for building high-performance web applications. You can use an Azure storage account to store or retrieve data, for example. Such data might include files, blobs, queues, or tables. In this article we’ll look at how we can upload data to Azure Blob storage from an ASP.Net Core application.

Create an ASP.Net Core Web API project in Visual Studio
Assuming you’re running Visual Studio 2017, you can follow the steps outlined below to create an ASP.Net Core Web API project.
- In the Visual Studio IDE, click on File > New > Project.
- Select “ASP.Net Core Web Application (.Net Core)” from the list of templates displayed.
- Specify a name for the project.
- Click OK to save the project.
- Select “API” in the “New .Net Core Web Application…” window.
- Uncheck the “Enable Docker Support” checkbox.
- Select “No Authentication” as we won’t be using authentication here.
- Click OK.
This will create a new ASP.Net Core 2.1 project in Visual Studio 2017.
Microsoft plans to extend IntelliSense code analysis for Python to tools beyond Visual Studio, using its Python Language Server. IntelliSense provides autocompletions for variables, functions, and other symbols that appear as developers type code.
Available as a beta in the July release of the Python extension for Visual Studio, the Python Language Server will be offered later this year as a standalone component for use with tools that support the Language Server Protocol. That protocol lets editing tools and IDEs support multiple languages.
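Under the hood, the Language Server Protocol is a JSON-RPC exchange in which every message is framed with a Content-Length header. As a rough illustration of that wire format (a minimal sketch in Python; the request payload shown is a generic example, not taken from Microsoft’s implementation):

```python
import json

def encode_lsp_message(payload: dict) -> bytes:
    """Frame a JSON-RPC payload with the Content-Length header
    that the Language Server Protocol's base protocol requires."""
    body = json.dumps(payload).encode("utf-8")
    header = f"Content-Length: {len(body)}\r\n\r\n".encode("ascii")
    return header + body

def decode_lsp_message(data: bytes) -> dict:
    """Parse one framed LSP message back into a Python dict."""
    header, _, body = data.partition(b"\r\n\r\n")
    length = int(header.split(b":")[1])
    return json.loads(body[:length])

# A completion request of the kind an editor sends to a language
# server when the user types and expects IntelliSense-style results.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "textDocument/completion",
    "params": {
        "textDocument": {"uri": "file:///app/main.py"},
        "position": {"line": 10, "character": 4},
    },
}

framed = encode_lsp_message(request)
assert decode_lsp_message(framed) == request
```

Because the framing is this simple, any editor that speaks the protocol can drive any language server over stdin/stdout, which is what lets the Python Language Server plug into tools beyond Visual Studio.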
Two reports from leading analyst firms, Intersect360 Research and Hyperion Research, show that the high-performance computing market has reached an inflection point. The cloud segment includes Microsoft, Amazon Web Services, and Google.
Intersect360 says high-performance cloud spending by high-performance computing customers grew by 44 percent from 2016 to 2017, to about $1.1 billion—much faster than the growth in the total high-performance computing market, which is still mostly traditional on-premises hardware clusters.
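Working backward from those figures (a quick check using only the numbers above), 44 percent growth to roughly $1.1 billion in 2017 implies a 2016 base of about $760 million:

```python
# Back out the 2016 baseline implied by Intersect360's figures:
# 44 percent growth from 2016 to 2017, reaching about $1.1 billion.
spend_2017 = 1.1e9            # dollars
growth = 0.44
spend_2016 = spend_2017 / (1 + growth)
print(f"Implied 2016 HPC cloud spend: ${spend_2016 / 1e9:.2f} billion")
# → Implied 2016 HPC cloud spend: $0.76 billion
```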
The two related reasons for the faster cloud adoption of high-performance computing are pretty clear to me.
Collette Stumpf is a software designer at Surge.
Successful software projects please customers, streamline processes, or otherwise add value to your business. But how do you ensure that your software project will result in the improvements you are expecting? Will users experience better performance? Will the productivity across all tasks improve as you hoped? Will users be happy with your changes and return to your product again and again as you envisioned?
AI’s rapid evolution is producing an explosion in new types of hardware accelerators for machine learning and deep learning.
Some people refer to this as a “Cambrian explosion,” which is an apt metaphor for the current period of fervent innovation. It refers to the period about 500 million years ago when essentially every biological “body plan” among multicellular animals appeared for the first time. From that point onward, these creatures—ourselves included—fanned out to occupy, exploit, and thoroughly transform every ecological niche on the planet.
A distributed file system, a MapReduce programming framework, and an extended family of tools for processing huge data sets on large clusters of commodity hardware, Hadoop has been synonymous with “big data” for more than a decade. But no technology can hold the spotlight forever.
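The MapReduce programming model at Hadoop’s core is easy to illustrate. A toy, single-process word count in Python (a sketch of the map/shuffle/reduce phases only, not of Hadoop’s distributed runtime):

```python
from collections import defaultdict
from itertools import chain

def map_phase(document: str):
    # Mapper: emit a (word, 1) pair for every word in the document.
    for word in document.lower().split():
        yield (word, 1)

def shuffle(pairs):
    # Shuffle: group all emitted values by key, as the framework
    # does between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reducer: sum the counts collected for each word.
    return {word: sum(counts) for word, counts in groups.items()}

documents = ["big data big clusters", "big insights"]
pairs = chain.from_iterable(map_phase(d) for d in documents)
counts = reduce_phase(shuffle(pairs))
print(counts)  # → {'big': 3, 'data': 1, 'clusters': 1, 'insights': 1}
```

Hadoop’s contribution was never the model itself but running it reliably across thousands of commodity machines, which is exactly the ground that newer engines like Spark now contest.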
While Hadoop remains an essential part of big data platforms, the major Hadoop vendors, namely Cloudera, Hortonworks, and MapR, have changed their platforms dramatically. Once-peripheral projects like Apache Spark and Apache Kafka have become the new stars, and the focus has turned to other ways to drill into data and extract insight.
Let’s take a brief tour of the three leading big data platforms, what each adds to the mix of Hadoop technologies to set it apart, and how they are evolving to embrace a new era of containers, Kubernetes, machine learning, and deep learning.