Zero trust startup Illumio raises $225M to protect multicloud and edge

Datacenter and cloud security company Illumio today announced it has raised $225 million in a series F funding round led by Thoma Bravo, with Franklin Templeton, Hamilton Lane, and Blue Owl Capital also investing in the developer of zero trust security software. Illumio’s latest funding round sets the company’s valuation at $2.8 billion.

Illumio, headquartered in Sunnyvale, California, plans to spend its newfound riches in two main areas: go-to-market activities and product innovation, CEO and cofounder Andrew Rubin told VentureBeat on Wednesday. Rubin said the new funds would support plans to expand Illumio’s presence in more global markets and to build out its partner channel.

The company aims to conduct more joint go-to-market activities with system integrator partners and to begin working with managed security service providers (MSSPs), a solution provider category the company has not yet tapped into, he said.

“We’ve spent a lot of time thinking about how to put these resources to work and go-to-market is definitely bucket No. 1 for us, while innovation is also something we’re always going to do as a technology company. We want to see a significant uptick with our channel, where we want to have a stronger voice, a market-leading voice in zero trust segmentation,” Rubin said.

Founded in 2013, Illumio has gained attention for its approach to zero trust security, which promises real-time detection, mitigation, and containment of cyberthreats before they can spread and take down entire networks. Illumio was listed among the top-rated zero trust software developers in the latest Forrester Zero Trust Wave rankings alongside companies like Akamai Technologies, Cisco, and Palo Alto Networks.

Rubin said Illumio had just closed its strongest-ever quarter and, in its just-completed fiscal year, added more new customers than in any prior year of the company’s history. The company’s core products are Illumio Core and Illumio Edge, which provide a cloud-native method of segmenting hybrid, multicloud IT estates in the datacenter and at the network edge.

Illumio’s approach is to quickly inoculate workloads and data repositories against cyberattacks to minimize the damage of a breach across applications, containers, clouds, datacenters, and endpoints.

Stop a leak, stop a flood

Illumio holds that the old, detection-based methods of protecting against cyberthreats aren’t enough to mitigate risk in today’s more dangerous and complex security climate. Rubin pointed to a rash of recent high-profile cyberattacks, including those on Colonial Pipeline, SolarWinds, and JBS USA, as evidence that the danger and impact of such incidents are graver than ever before.

“We’ve had a security model based on layering on more and more levels of detection for decades. But look at the past 180 days — when things go wrong in security, the results are catastrophic. Detection is still a part of a strong security posture, but it’s no longer enough, and these recent breaches are the evidence of it,” Rubin said.

Rather than waiting for a threat to be detected, zero trust security treats networks, systems, and people as if they are always under attack, he said. This means users and systems are continuously required to authenticate themselves and prove both that they belong in the digital spaces they are accessing, like a network or file storage repository, and that they are allowed to perform the activities they are attempting.

Illumio’s segmentation approach to zero trust further bolsters this cybersecurity technique by enabling different parts of a network to be shut off from each other when a contagion like ransomware is trying to spread. Rubin described Illumio’s segmentation methods as similar to the way a submarine has compartments that can be sealed when a leak occurs to prevent the entire vessel from being flooded.
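
To make the compartment analogy concrete, the sketch below shows how a default-deny, label-based segmentation policy can be expressed in a few lines of Python. It is purely illustrative: the workload labels, rules, and function names are invented for this example and do not reflect Illumio’s actual policy model or APIs.

from dataclasses import dataclass

@dataclass(frozen=True)
class Workload:
    name: str
    label: str  # e.g., "web", "app", or "db"

# Explicit allow-list: traffic is permitted only between these label pairs.
ALLOWED_FLOWS = {
    ("web", "app"),
    ("app", "db"),
}

def is_allowed(src: Workload, dst: Workload) -> bool:
    """Default deny: any flow not explicitly allowed is blocked."""
    return (src.label, dst.label) in ALLOWED_FLOWS

web = Workload("storefront", "web")
db = Workload("orders-db", "db")

# A compromised web server cannot reach the database directly, which limits
# how far ransomware can spread before it hits a sealed "compartment."
print(is_allowed(web, db))  # False

The key property is the default-deny posture: anything not explicitly allowed is blocked, so a breach in one compartment does not automatically become a breach of the whole network.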

Buy-in at the top

Rubin said decision-makers in IT organizations are prioritizing cybersecurity more than ever, and this extends to the C-suite and the highest levels of government.

“Adopting zero trust strategies has never been more important for organizations across all industries, as the Biden Administration’s recent cybersecurity Executive Order demonstrates,” he said, referring to last month’s executive order setting new software standards for government agencies and creating a cybersecurity review board that will probe major breaches much as the National Transportation Safety Board (NTSB) investigates air accidents.

“In the past when a breach occurred, we asked, ‘How much data was stolen?’ We were still sort of removed from the real-world impact of these incidents. But now, we’re asking about that real-world impact — for example, will a breach mean fuel shortages and gas lines that affect our daily lives immediately?” Rubin said.

Illumio has seen a shift in how people think about security during customer calls, he said. CIOs, CISOs, and even CEOs are “finally asking the right questions” about strategic risk mitigation and even driving conversations toward zero trust and “working on the assumption that we’re getting breached even when we do everything right.”

Toyota Whiffed on EVs. Now It’s Trying to Slow Their Rise

Executives at Toyota had a moment of inspiration when the company first developed the Prius. That moment, apparently, has long since passed.

The Prius was the world’s first mass-produced hybrid car, years ahead of any competitors. The first model, a small sedan, was classic Toyota—a reliable vehicle tailor-made for commuting. After a major redesign in 2004, sales took off. The Prius’ Kammback profile was instantly recognizable, and the car’s combination of fuel economy and practicality was unparalleled. People snapped them up. Even celebrities seeking to burnish their eco-friendly bona fides were smitten with the car. Leonardo DiCaprio appeared at the 2008 Oscars in one.

As the Prius’ hybrid technology was refined over the years, it started appearing in other models, from the small Prius c to the three-row Highlander. Even the company’s luxury brand, Lexus, hybridized several of its cars and SUVs.

For years, Toyota was a leader in eco-friendly vehicles. Its efficient cars and crossovers offset emissions from its larger trucks and SUVs, giving the company a fuel-efficiency edge over some of its competition. By May 2012, Toyota had sold 4 million vehicles in the Prius family worldwide.

The next month, Tesla introduced the Model S, which dethroned Toyota’s hybrid as the leader in green transportation. The new car proved that long-range electric vehicles, while expensive, could be both practical and desirable. Battery advancements promised to slash prices, eventually bringing EVs to price parity with fossil-fuel vehicles.

But Toyota misunderstood what Tesla represented. While Toyota invested in Tesla, it saw the startup not as a threat but rather a bit player that could help Toyota meet its EV mandates. In some ways, that view was justified. For the most part, the two didn’t compete in the same segments, and Toyota’s worldwide volume dwarfed that of the small US manufacturer. Besides, hybrids were just a stopgap until Toyota’s hydrogen fuel cells were ready. At that point, the company thought, hydrogen vehicles’ long range and quick refueling would make EVs obsolete.

Evidently, Toyota didn’t pick up on the subtle shift that was occurring. It’s true that hybrids were a bridge to cleaner fuels, but Toyota was overestimating the length of that bridge. Just as BlackBerry dismissed the iPhone, Toyota dismissed Tesla and EVs. BlackBerry thought the world would need physical keyboards for many more years. Toyota thought the world would need gasoline for several more decades. Both were wrong.

In tethering itself to hybrids and betting its future on hydrogen, Toyota now finds itself in an uncomfortable position. Governments around the world are moving to ban fossil-fuel vehicles of any kind, and they’re doing so far sooner than Toyota anticipated. With EV prices dropping and charging infrastructure expanding, fuel-cell vehicles are unlikely to be ready in time.

In a bid to protect its investments, Toyota has been strenuously lobbying against battery-powered electric vehicles. But is it already too late?

Hydrogen Dead End

Having spent the last decade ignoring or dismissing EVs, Toyota now finds itself a laggard in an industry that’s swiftly preparing for an electric—not just electrified—transition.

Sales of Toyota’s fuel-cell vehicles haven’t lit the world on fire—the Mirai continues to be a slow seller, even when bundled with thousands of dollars’ worth of hydrogen, and it’s unclear if its winsome but slow redesign will help. Toyota’s forays into EVs have been timid. Initial efforts focused on solid-state batteries that, while lighter and safer than existing lithium-ion batteries, have proven challenging to manufacture cost-effectively, much like fuel cells. Last month, the company announced that it would release more traditional EV models in the coming years, but the first one won’t be available until the end of 2022.

Confronted with a losing hand, Toyota is doing what most large corporations do when they find themselves playing the wrong game—it’s fighting to change the game.

Toyota has been lobbying governments to water down emissions standards or oppose fossil-fuel vehicle phaseouts, according to a New York Times report. In the past four years, Toyota’s political contributions to US politicians and PACs have more than doubled. Those contributions have gotten the company into hot water too. By donating to congresspeople who oppose tighter emissions limits, the company funded lawmakers who objected to certifying the results of the 2020 presidential election. Though Toyota had promised to stop doing so in January, it was caught making donations to the controversial legislators as recently as last month.

OpenAI releases Triton, a programming language for AI workload optimization

OpenAI today released Triton, an open source, Python-like programming language that enables researchers to write highly efficient GPU code for AI workloads. Triton makes it possible to reach peak hardware performance with relatively little effort, OpenAI claims, producing code on par with what an expert could achieve in as few as 25 lines.

Deep neural networks have emerged as an important type of AI model, capable of achieving state-of-the-art performance across natural language processing, computer vision, and other domains. The strength of these models lies in their hierarchical structure, which generates a large amount of highly parallelizable work well-suited for multicore hardware like GPUs. Frameworks for general-purpose GPU computing such as CUDA and OpenCL have made the development of high-performance programs easier in recent years. Yet, GPUs remain especially challenging to optimize, in part because their architectures rapidly evolve.

Domain-specific languages and compilers have emerged to address the problem, but these systems tend to be less flexible and slower than the best handwritten compute kernels available in libraries like cuBLAS, cuDNN, or TensorRT. Reasoning about the low-level details of GPU optimization can be challenging even for seasoned programmers. The purpose of Triton, then, is to automate these optimizations so that developers can focus on the high-level logic of their code.

“Novel research ideas in the field of deep learning are generally implemented using a combination of native framework operators … [W]riting specialized GPU kernels [can improve performance,] but [is often] surprisingly difficult due to the many intricacies of GPU programming. And although a variety of systems have recently emerged to make this process easier, we have found them to be either too verbose, lack flexibility, generate code noticeably slower than our hand-tuned baselines,” Philippe Tillet, Triton’s original creator, who now works at OpenAI as a member of the technical staff, wrote in a blog post. “Our researchers have already used [Triton] to produce kernels that are up to 2 times more efficient than equivalent Torch implementations, and we’re excited to work with the community to make GPU programming more accessible to everyone.”

Simplifying code

According to OpenAI, Triton — which has its origins in a 2019 paper submitted to the International Workshop on Machine Learning and Programming Languages — simplifies the development of specialized kernels that can be much faster than those in general-purpose libraries. Its compiler simplifies code and automatically optimizes and parallelizes it, converting it into code for execution on recent Nvidia GPUs. (CPUs, AMD GPUs, and platforms other than Linux aren’t currently supported.)
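
To give a sense of what this looks like in practice, the sketch below is a minimal vector-addition kernel modeled on the introductory example in Triton’s tutorials. The tensor sizes and block size are arbitrary choices for illustration; the @triton.jit decorator, tl.program_id, tl.arange, tl.load, and tl.store are part of Triton’s Python API, and the grid passed at launch determines how many program instances run the kernel.

import torch
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    # Each program instance handles one contiguous block of elements.
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements  # guard against reading past the end of the tensors
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)

def add(x, y):
    out = torch.empty_like(x)
    n_elements = out.numel()
    # The grid tells Triton how many program instances to launch.
    grid = lambda meta: (triton.cdiv(n_elements, meta["BLOCK_SIZE"]),)
    add_kernel[grid](x, y, out, n_elements, BLOCK_SIZE=1024)
    return out

x = torch.rand(98432, device="cuda")
y = torch.rand(98432, device="cuda")
print(torch.allclose(add(x, y), x + y))  # True

Even in this small example, the programmer never writes thread indexing, shared-memory staging, or synchronization by hand; those details are left to the Triton compiler.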

“The main challenge posed by our proposed paradigm is that of work scheduling — i.e., how the work done by each program instance should be partitioned for efficient execution on modern GPUs,” Tillet explains in Triton’s documentation website. “To address this issue, the Triton compiler makes heavy use of block-level data-flow analysis, a technique for scheduling iteration blocks statically based on the control- and data-flow structure of the target program. The resulting system actually works surprisingly well: our compiler manages to apply a broad range of interesting optimization automatically.”

The first stable version of Triton, along with tutorials, is available from the project’s GitHub repository.

Astronomers have spotted x-rays from behind a supermassive black hole

“This is a really exciting result,” says Edward Cackett, an astronomer at Wayne State University who was not involved with the study. “Although we have seen the signature of x-ray echoes before, until now it has not been possible to separate out the echo that comes from behind the black hole and gets bent around into our line of sight. It will allow for better mapping of how things fall into black holes and how black holes bend the space time around them.”

The release of energy by black holes, sometimes in the form of x-rays, is an absurdly extreme process. And because supermassive black holes release so much energy, they are essentially powerhouses that allow galaxies to grow around them. “If you want to understand how galaxies form, you really need to understand these processes outside the black hole that are able to release these enormous amounts of energy and power, these amazingly bright light sources that we’re studying,” says Dan Wilkins, an astrophysicist at Stanford University and the lead author of the study. 

The study focuses on a supermassive black hole at the center of a galaxy called I Zwicky 1 (I Zw 1 for short), around 100 million light-years from Earth. In supermassive black holes like I Zw 1’s, large amounts of gas fall toward the center (the event horizon, which is basically the point of no return) and tend to flatten out into a disk. Above the black hole, a confluence of supercharged particles and magnetic field activity, known as the corona, results in the production of high-energy x-rays.

Some of these x-rays are shining straight at us, and we can observe them normally, using telescopes. But some of them also shine down toward the flat disk of gas and will reflect off it. The rotation of I Zw 1’s black hole is slowing down at a higher rate than that seen in most supermassive black holes, which causes surrounding gas and dust to fall in more easily and feed the black hole from multiple directions. This, in turn, leads to greater x-ray emissions, which is why Wilkins and his team were especially interested.

While Wilkins and his team were observing this black hole, they noticed that the corona appeared to be “flashing.” These flashes, caused by x-ray pulses reflecting off the massive disk of gas, were coming from behind the black hole’s shadow—a place that is normally hidden from view. But because the black hole bends the space around it, the x-ray reflections are also bent around it, which means we can spot them.

The signals were found using two different space-based telescopes optimized to detect x-rays in space: NuSTAR, which is run by NASA, and XMM-Newton, which is run by the European Space Agency.

The biggest implication of the new findings is that they confirm a prediction Albert Einstein made more than a century ago as part of his theory of general relativity: the way light ought to bend around gargantuan objects like supermassive black holes.
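
For a rough sense of the effect, in the weak-field limit of general relativity a light ray passing a mass M at impact parameter b (its closest approach) is deflected by approximately

\[ \alpha \approx \frac{4GM}{c^{2} b} \]

a standard textbook result. Close to a black hole that approximation breaks down: for a non-spinning black hole, rays approaching the photon sphere at r = 3GM/c^2 can be bent through arbitrarily large angles, which is how x-rays emitted on the far side can be redirected into our line of sight. The study’s detailed modeling of these echoes requires full general relativistic ray tracing around a spinning black hole rather than this simple formula.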

“It’s the first time we really see the direct signature of the way light bends all the way behind the black hole into our line of sight, because of the way the black hole warps space around itself,” says Wilkins.

“While this observation doesn’t change our general picture of black hole accretion, it is a nice confirmation that general relativity is at play in these systems,” says Erin Kara, an astrophysicist at MIT who was not involved with the study.

Despite the name, supermassive black holes are so far away that they really just look like single points of light, even with state-of-the-art instruments. It’s not going to be possible to take images of all of them the way scientists used the Event Horizon Telescope to capture the shadow of a supermassive black hole in galaxy M87. 

So although it’s early, Wilkins and his team are hopeful that detecting and studying more of these x-ray echoes from behind the bend could help us create partial or even full pictures of distant supermassive black holes. In turn, that could help them unlock some big mysteries around how supermassive black holes grow, sustain entire galaxies, and create environments where the laws of physics are pushed to the limit.  
