Ubuntu

The launch of Ubuntu 18.04 'Bionic Beaver' is a big deal. Not only is it an LTS release, with five years of support, that will see millions of users install Ubuntu for the first time with GNOME firmly in the desktop environment slot, but it could be the release that sees Canonical, the company behind Ubuntu, go through an IPO. We spoke with Will Cooke, desktop director at Canonical, and David Bitton, engineering manager for Ubuntu Server, about the overall goals for Ubuntu 18.04 LTS and future plans.

Will Cooke: So we're on another LTS release, which comes with five years of support. And that's important for our typical user base, because they do not want to have to … well, they want to be safe in the knowledge that the platform they're working on, and that they trust, is going to be secure and up to date, and that it will keep running for a long time.

Usually, we find that most of our users like to install it once, then leave it alone, knowing it will look after itself. That's perhaps more important in the cloud environment than on the desktop. But the joy of Ubuntu is that the packages you run on your desktop are the same ones you run everywhere else. Let's say you're a web developer, and you want to run an instance of Apache and an instance of MySQL, and you want to have your developer tools there. You can do all that development on your machine, then deploy it to the cloud, running the same version of Ubuntu, and be sure that the packages installed on your desktop are exactly the same as those in your server installation.
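As a hedged illustration of that workflow (the package names are the standard Ubuntu archive ones, but the cloud host name is made up), the same apt commands pull the same package versions on an 18.04 desktop and an 18.04 cloud instance:

    # On the 18.04 desktop used for development
    sudo apt update
    sudo apt install apache2 mysql-server

    # Later, on a hypothetical Ubuntu 18.04 cloud instance
    ssh ubuntu@my-cloud-instance
    sudo apt update
    sudo apt install apache2 mysql-server

    # Confirm both machines received the same versions from the bionic archive
    apt-cache policy apache2 mysql-server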

And the fact that they are supported for five years means you do not have to keep upgrading your machines. And when you have thousands of machines deployed in the cloud in some way, the last thing you want to do is keep having to upgrade them every year, and deal with all the fallout that comes with that.

The general theme for Ubuntu in 18.04 is the ability to develop locally and deploy, whether to the public cloud, your private cloud, your own servers, whatever you want to do. But also to edge devices, too.

So we have made a lot of progress on our Ubuntu Core products, which is a really small, minimal version of Ubuntu that ships with the bare minimum needed to bring a device up and get it on the network.

And so, the packages that you can deploy on your server, on your desktop, can also be deployed on IoT devices, on edge devices, on your network switches: you know, everywhere. And that gives you unrivalled confidence and reliability, knowing that the things you're working on can be packaged, shipped to other devices, and will carry on working the same way they work on your desktop, just as they do on all of these other devices.

And a key player in that story is the snap packages we've been working on. These are self-contained binaries that work not only on Ubuntu, but also on Fedora or CentOS or Arch.

So as an application developer, for example, […] you can bundle all those dependencies into a self-contained package, and then ship it to your various devices. And you know that it will work, whether they run Ubuntu or not.

That's a really powerful message for developers: do your work on Ubuntu; package it; and push it out to any device that runs Linux, and you can trust it to keep working for the next five years.
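As a rough sketch of that cross-distribution story (the snap name my-app is hypothetical, and the exact snapd installation step varies by distribution), the same snap can be installed on Fedora once snapd is present:

    # On Ubuntu 18.04, snapd is already installed
    sudo snap install my-app      # 'my-app' is a made-up example snap

    # On Fedora, install snapd first, then the very same snap
    sudo dnf install snapd
    sudo snap install my-app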

Ubuntu 18.04 has a strong focus on snaps. This is a new package format that allows application developers to bundle their software, with all of its dependencies included, into a secure, isolated container that runs on Ubuntu (and other compatible Linux distributions, such as Solus). This has seen many high-profile software products, including Slack and Skype, appear in the Snap Store in time for the new release.
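On a stock 18.04 install, these are the usual Snap Store commands for finding and installing such applications; whether a given snap needs the --classic flag depends on the confinement the publisher chose, so treat the flags below as illustrative:

    snap find slack                    # search the Snap Store
    sudo snap install slack --classic  # --classic is only needed for classically confined snaps
    sudo snap install skype --classic
    snap list                          # show installed snaps and their versions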

What is the common problem that developers have with DEBs and RPMs that led to the development of the snap format?

WC: There are a few. Packaging a DEB, or an RPM for that matter, is a bit of a black art. There is a certain amount of magic involved in it. And the learning process you have to go through to understand how to properly package something as a DEB or RPM means the barrier to entry is quite high there. So snaps simplify a lot of that.

Again, part of it, really, is this ability to bundle all the dependencies with it. If you package your application and say, "Okay, I depend on this version of this library for this architecture," then dependency resolution could take care of that. It probably would.

But as soon as your underlying operating system changes that library, for example, your package breaks. And you can never be sure where that package will be deployed, and what version of which operating system it will end up on.

So by bundling all of that together in a snap, you are absolutely sure that all of your dependencies are shipped along with your application. Then, when it reaches the other end, it will unpack and run correctly.
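To make that concrete, here is a minimal, hypothetical snapcraft.yaml sketch: the application name hello-web, its binary path and the libsqlite3-0 dependency are invented for illustration, while the keys themselves (base, confinement, apps, plugs, parts, stage-packages) are standard snapcraft ones.

    name: hello-web
    version: '1.0'
    summary: Hypothetical web app packaged as a snap
    description: Example only; it bundles its own copy of libsqlite3.
    base: core18          # build against the Ubuntu 18.04 runtime
    grade: stable
    confinement: strict   # fully confined; access is granted through interfaces

    apps:
      hello-web:
        command: bin/hello-web
        plugs: [network, network-bind]   # only network access, nothing else

    parts:
      hello-web:
        plugin: dump
        source: .
        stage-packages:
          - libsqlite3-0   # this library travels inside the snap itself

Running snapcraft against a file like this produces a single .snap that carries the listed library with it, which is exactly the "ship your dependencies with your application" point being made here.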

The other key feature of snaps, in my opinion, is the security confinement aspect. X.Org, for example, is getting a bit long in the tooth now. It was never really designed with secure computing in mind. So it's pretty easy, well, not necessarily X.Org specifically, but the whole operating system; if something is running as root, or running as your user, then it has the permissions of the user who is executing it.

So you could install an application where the developer, for example, could go into your home directory, go into your SSH key directory, make a copy of your keys and email them off somewhere. It will do so with the same permissions as the user running it. And yes, that is a real concern.

With snaps and confinement, you can say: "This application, this snap, does not have access to those things." It will not physically be able to read those files on the disk. They do not exist as far as it is concerned.
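That confinement is expressed through snapd's interface system. A hedged example with a recent snapd, again using the made-up snap name my-app, of inspecting and adjusting what a confined snap can reach:

    snap connections my-app                    # list the interfaces this snap plugs into
    sudo snap disconnect my-app:home           # revoke access to files in your home directory
    sudo snap connect my-app:removable-media   # explicitly grant access to USB drives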

Then, from a user's perspective, you can download this new application because you heard about it on the internet. You do not know what it is, you do not know where it comes from, but you can install it and run it, safe in the knowledge that it will not be able to simply trawl through your disk and look through all these files that you do not necessarily want it to have access to.

So, in my opinion, those are the two key stories. The write-once, run-anywhere side of things, and then the security aspect of confinement as well.
