KW Glossary
Uncloud (de-cloud)
Posted by Margaret Rouse
Uncloud is the removal of applications and data from a cloud computing platform.
In recent years, organizations ranging from small and medium-sized businesses to large enterprises have turned to the cloud to run applications, store data and accomplish other IT tasks. Over time, however, an organization may elect to uncloud one, a few or, possibly, all of its cloud-based assets. Examples could include shutting down a server instance in a public cloud and moving the associated software and data to an in-house data center or colocation facility. De-cloud is another term used to describe this reverse cloud migration.
In the process of unclouding, the cloud customer or, potentially, a channel partner acting on its behalf, will work with the cloud vendor to extract the customer's applications and data. The task involves locating the data and mapping the application's dependencies within the cloud vendor's infrastructure. The unclouding customer -- and its channel partner -- may encounter higher levels of complexity in a public multi-tenant cloud setting. A customer may have to wait for the cloud vendor's scheduled downtime to migrate its applications and data, or the cloud provider may limit the customer's use of migration tools so as not to interfere with the application performance of other customers.
Customers may cite a number of reasons for wanting to uncloud. Factors include security issues, liability concerns and difficulty in integrating cloud-based applications with on-premises enterprise applications and data. Frustrated expectations with respect to the cloud's cost efficiency may also influence de-clouding decisions. Anecdotal evidence suggests that customers citing cost as a factor may elect to move applications to an in-house, hyper-converged infrastructure as the better economic choice.
Reverse migration on the rise: Channel partners see customers uncloud
by John Moore
Channel partners report that a small but increasing number of customers are moving some or all of their applications off the cloud.
Channel partners say some of their customers have begun to uncloud and are asking for help migrating back to in-house data centers or colocation facilities.
While cloud computing, in general, remains a high-growth area, a counter trend of reverse migration has started to surface. Organizations, industry executives said, cite a number of reasons for moving some or all of their applications off the cloud: security and compliance concerns, frustration over elusive cost savings, and the changing data center economics of hyper-converged architecture.
At Trace3 Inc., cost and hyper-convergence played key roles in one customer's off-the-cloud migration. Trace3, based in Irvine, Calif., focuses on data center, big data and cloud technologies. Mark Campbell, research principal and director of Innovation Research at Trace3, said a retail client recently completed a "back-sourcing" exercise in which it migrated its entire cloud footprint into a colocation data center, where the retailer could control the infrastructure.
"Cost was the primary driver," Campbell said of the retail customer. "They were estimating they could save 40% over their cloud IaaS [infrastructure as a service] and PaaS [platform as a service] provider by building their own private cloud built on hyper-converged and commodity infrastructure," he said.
Campbell noted that he hasn't had the opportunity to follow up with the company to see whether it actually realized the projected savings.
Getting off the cloud
Nevertheless, other Trace3 customers have taken steps to uncloud, a pattern Campbell began noticing last year. He said a few customers -- numbered in the dozens, out of a client base of some 2,000 companies -- have encountered issues in the cloud.
"The vast majority of our customers have moved at least some of their enterprise applications to the cloud, and the vast majority of those are continuing in the cloud," Campbell said. "There is a small minority, however, that are moving some or all of their applications back into their own data centers or colocation sites more under their control."
"We are exiting the honeymoon stage, and that is always a rude awakening -- and expectation readjustment -- for both parties."
Paul Dippell, CEO, Service Leadership
Irwin Teodoro, senior director of data center transformation at Datalink Corp., a data center services provider based in Eden Prairie, Minn., has also observed declouding among his company's customers. He said for every 10 companies pursuing some form of cloud computing, he has seen two or three looking to get out of the cloud.
"This is definitely a trend we are going to see more of."
The counter-cloud migration may signal a resetting of expectations among channel partner customers. Paul Dippell, CEO of Service Leadership Inc., a company based in Plano, Texas, that provides a financial and operational benchmark for channel companies, said cloud vendors tell customers that their offerings are "wonderful, weightless, agile, low cost, mobile [and] fantastically free of the impediments of past computing models."
But that vision doesn't always line up with reality.
"What the customers are experiencing is different enough that a material number of customers are declouding or significantly changing -- reducing -- their cloud strategies to regain a more solid computing foundation and rational cost," Dippell said.
"I don't expect cloud to fail, by any means, and I do expect it to grow," Dippell said. "But we are exiting the honeymoon stage, and that is always a rude awakening -- and expectation readjustment -- for both parties."
Dippell added that he's heard anecdotal accounts of solution providers winning new customers by agreeing to decloud them.
When customers uncloud: Top factors
A number of factors influence migration decisions. Unforeseen security issues, for example, may drive some applications back in-house. In general, risk and liability concerns are tempering enthusiasm for the cloud, said Dan Liutikas, managing attorney at InfoTech Law Advocates P.C., and chief legal officer and secretary at CompTIA.
Channel partners, as well as customers, are questioning whether cloud is the correct answer for every customer. While the cloud adoption wave continues, more and more service providers are weighing whether cloud is the right approach for a particular customer or a subset of customers, Liutikas said. The latter includes companies in highly regulated industries such as healthcare and financial services.
"Sometimes … on-premises is the better answer based on their customers' needs," he said.
Organizations may also struggle to achieve deep integration between their cloud applications and their on-premises legacy applications and data, according to industry executives. But beyond legal and technical hurdles, cost has become a sticking point for some cloud users.
Unexpected cloud costs may stem from a customer's failure to quantify all the necessary services in its initial calculations. Campbell said most customers tend to be accurate in estimating traditional infrastructure and capacity costs for servers, storage capacity and bandwidth, among other components. By contrast, they tend to underestimate the cost of items beyond their data centers, including the cost of creating multiple storage snapshots to back up data, the cost of data replication and the cost of restoring data.
"This leads to budgetary surprises," Campbell said.
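The gap Campbell describes can be made concrete with a toy calculation. The sketch below compares a "naive" monthly estimate covering only compute, storage and bandwidth against one that also counts the commonly underestimated items; all dollar figures and the function itself are illustrative assumptions, not real cloud pricing.

```python
# Hypothetical cloud cost estimate: the line items customers usually price
# versus the ones they tend to forget (snapshots, replication, restores).
# All figures are invented for illustration.

def monthly_cloud_cost(compute, storage, bandwidth,
                       snapshots=0.0, replication=0.0, restores=0.0):
    """Sum a monthly bill (USD) from its component line items."""
    return compute + storage + bandwidth + snapshots + replication + restores

# The estimate most customers make up front.
naive = monthly_cloud_cost(compute=8000, storage=2000, bandwidth=1000)

# The bill once backup snapshots, replication and restores are counted.
actual = monthly_cloud_cost(compute=8000, storage=2000, bandwidth=1000,
                            snapshots=900, replication=1200, restores=400)

overrun = (actual - naive) / naive
print(f"naive estimate: ${naive:,.0f}/month")
print(f"actual bill:    ${actual:,.0f}/month")
print(f"budget overrun: {overrun:.0%}")
```

With these invented numbers the overlooked items add more than 20% to the expected bill, which is the kind of "budgetary surprise" the article refers to.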
Cloud sprawl can also stress budgets.
"Much like [virtual machine] sprawl, it is not uncommon for the initial targets of a cloud installation to grow as both the IT and business discover new applications, features and snap-of-the-fingers capacity bursts," Campbell explained. "These all add line items to the monthly bill."
In addition, cloud offerings may run afoul of conventional budget controls.
Campbell said traditional IT organizations built their financial processes and controls to monitor big-ticket items such as projects and large Capex purchases, and smaller items such as consumables and onetime Opex expenditures handled on an approval basis.
"This works great in a data-center-centric operation, but imagine the befuddled expression on the comptroller's face when he gets his first 23,000-line-item bill from Amazon," Campbell said. "It is very hard to even decipher what these expenditures are for, let alone garner business justification."
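Deciphering a bill like the one Campbell describes usually starts with aggregating the line items by service. The sketch below shows the idea on a tiny invented CSV; the column names and sample rows are assumptions, and a real provider's billing export will have a different schema.

```python
# Sketch: roll up a line-item cloud bill (CSV) by service so the largest
# spend surfaces first. The schema and sample data are hypothetical.
import csv
import io
from collections import defaultdict

SAMPLE_BILL = """\
service,description,cost
EC2,instance i-abc hours,410.25
S3,standard storage,88.10
EC2,ebs snapshot,35.00
DataTransfer,egress us-east-1,120.40
"""

def summarize_bill(csv_text):
    """Return (service, total_cost) pairs, largest spend first."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["service"]] += float(row["cost"])
    return sorted(totals.items(), key=lambda kv: -kv[1])

for service, cost in summarize_bill(SAMPLE_BILL):
    print(f"{service:<14}${cost:>9.2f}")
```

Even a rollup this simple turns thousands of opaque line items into a handful of totals a comptroller can question, which is the first step toward the business justification Campbell says is so hard to garner.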
Customers disappointed with cloud cost savings may end up migrating applications to hyper-converged infrastructures.
"Some [companies] are pulling in applications from the cloud to their data centers," Campbell said. "If they do that, we are seeing hyper-convergence as being one of those enabling mechanisms."
In addition to cost, corporate culture can play a role in a cloud reversal.
"Executives who are not fully aware of the concepts of the cloud feel somewhat apprehensive that data is somewhere else and feel lack of control," Teodoro said.
Managing the declouding challenge
Assisting customers as they back out of the cloud can prove difficult. Teodoro said public clouds in which multiple customers share a common infrastructure represent the greatest challenge. Dealing with maintenance windows is one issue. A customer can't just extract an application based on its own ad hoc maintenance timetable; they have to wait for the cloud provider's scheduled downtime.
"You can't move when you want to move," Teodoro said. "You've got to move at somebody else's pace and schedule."
Determining a cloud-based application's dependencies with respect to the cloud provider's infrastructure is another consideration. A channel partner working on an off-the-cloud migration project needs to figure out what virtual machines the application resides on and identify the virtual LANs and subnets in the compute infrastructure to which the application can be traced, Teodoro explained. The goal: extract the application without breaking something in the environment.
"The keys for us are really to understand the dependencies in the environment -- down to the infrastructure -- and find ways to carve out the environment into smaller chunks or workgroups," Teodoro said.
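One way to picture Teodoro's "carve the environment into smaller chunks" approach is as a graph problem: model each application, VM, VLAN and subnet as a node, each dependency as an edge, and take the connected components as candidate migration workgroups. The sketch below is a minimal illustration of that idea; the component names are hypothetical, and a real migration assessment would involve far richer dependency data.

```python
# Sketch: split an environment's dependency graph into independent
# migration workgroups via connected components. Names are invented.
from collections import defaultdict

def migration_workgroups(dependencies):
    """dependencies: iterable of (component_a, component_b) edges.
    Returns a list of sets, each an independently movable chunk."""
    graph = defaultdict(set)
    for a, b in dependencies:
        graph[a].add(b)
        graph[b].add(a)
    seen, groups = set(), []
    for node in graph:
        if node in seen:
            continue
        stack, group = [node], set()
        while stack:  # iterative depth-first search
            cur = stack.pop()
            if cur in seen:
                continue
            seen.add(cur)
            group.add(cur)
            stack.extend(graph[cur] - seen)
        groups.append(group)
    return groups

deps = [("crm-app", "vm-01"), ("vm-01", "vlan-10"),
        ("billing-app", "vm-07"), ("vm-07", "subnet-20")]
for group in migration_workgroups(deps):
    print(sorted(group))
```

Here the CRM and billing applications share no infrastructure, so they fall into two separate workgroups that could be extracted on different maintenance windows without breaking each other.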
Another complication: Migration tools can help channel companies uncloud customers, automating the tasks of data gathering, analysis and forensics. But in a shared, multi-tenant cloud, service providers can't use their own tools, since they could impact a cloud provider's other clients, Teodoro said.
Seeking a happy balance
Customers juggling multiple IT environments provide yet another degree of difficulty. Jim Piazza, vice president of service management at CenturyLink Inc., which offers colocation, public cloud and IT services, said customers such as software as a service providers may offer multiple versions of their software to support different clients. And those different versions may be hosted on different computing platforms: in-house private clouds, colocation centers and public clouds, for example.
"It's an interesting mix … that is really quite a challenge to manage," Piazza said.
Piazza said CenturyLink, based in Monroe, La., provides customers a service catalog to help them keep track of what version of their software is deployed where. In addition, the company has built interconnects between customers' colocation footprints in CenturyLink facilities and CenturyLink's public cloud. The service catalog and interconnects enable the company's clients to move their end customers from one platform to another, Piazza said.
Piazza likened migrating customers and their workloads among the various platforms to supporting a 3D jigsaw puzzle.
For Campbell, the cloud conundrum boils down to harmonizing the computing platforms now available to customers.
"It's finding that happy balance -- what lives best in the cloud and what lives best in-house."
User Interface (UI)
Also see human-computer interaction.
In information technology, the user interface (UI) is everything designed into an information device with which a human being may interact -- including display screen, keyboard, mouse, light pen, the appearance of a desktop, illuminated characters, help messages, and how an application program or a Web site invites interaction and responds to it. In early computers, there was very little user interface except for a few buttons at an operator's console. The user interface was largely in the form of punched card input and report output.
Later, a user was provided the ability to interact with a computer online and the user interface was a nearly blank display screen with a command line, a keyboard, and a set of commands and computer responses that were exchanged. This command line interface led to one in which menus (list of choices written in text) predominated. And, finally, the graphical user interface (GUI) arrived, originating mainly in Xerox's Palo Alto Research Center, adopted and enhanced by Apple Computer, and finally effectively standardized by Microsoft in its Windows operating systems.
The user interface can arguably include the total "user experience," which may include the aesthetic appearance of the device, response time, and the content that is presented to the user within the context of the user interface.
Contributor(s): Mike Dang
This was last updated in April 2005
Posted by: Margaret Rouse
End User (Usuario Final)
The end user is the operator of the system. There are different user profiles, ranging from the supervisor down to one who uses only certain limited, specific functionalities.
UTM vs. NGFW: Unique products or advertising semantics?
by: Michael Heller
In comparing UTM vs. NGFW, organizations find it difficult to tell whether there are real differences between the two products or whether it is just marketing semantics.
It can often be difficult to discern the difference between unified threat management (UTM) and next-generation firewalls (NGFW). Experts agree that the lines appear to be blurring between the two product sets, but enterprises that focus on defining each product type during the purchasing process may be making a mistake.
"Service providers for ISPs have different needs than enterprises," said Young. "So, UTM vendors will only offer basic firewall features as a price-play for that market."
Young said those differences in ease of use and support demands still exist today, though they have become more nuanced; there is overlap in the underlying technology of NGFW and UTM, and spec sheets tend to look similar. Young said that the key differences now are more around quality of features, and the level of support from channel partners to meet customer needs.
Young also noted that vendors tend to excel in one market or the other, like Fortinet Inc. with UTM for SMBs, or Palo Alto Networks Inc. with NGFW for enterprises. Few vendors can succeed in both, he said, like Check Point Software Technologies Ltd. has done.
"The confusion came from SMB vendors trying to move into the enterprise market without making channel and quality changes," said Young. "It was an intentional campaign to confuse, but very few end users are confused about what they need. It is either a racecar [NGFW] or a family van [UTM]."
Brazil admitted that the differences between NGFW and UTM can be confusing, even for experienced practitioners, but described UTM as a collection of unrelated security features, one of which is the firewall.
"UTM generally refers to a firewall with a mix of other 'bolted-on' security functions like antivirus and even email spam protection," said Brazil. "These are not access control features that typically define a firewall."
What traditionally has defined next-gen firewalls, Brazil said, is robust Layer 7 application access control, though an increasing number of NGFWs are being augmented with integrated threat intelligence, enabling them to deny known threats based on a broad variety of automatically updated policy definitions.
However, Brazil did caveat his distinctions by saying that a UTM could be considered an NGFW if it met the Layer 7 parameters, and an NGFW that included malware functions could be considered a UTM. Though, he was clear that despite these potential overlap points, he would keep the classifications separate because of a lack of similarities in other respects, like access control.
Brazil said that NGFW will eventually become the standard, and the terms NGFW and firewall will become synonymous. He said UTM will remain an important product for SMBs, especially when a company prioritizes simplicity of deployment over the depth of security and performance, but NGFW and UTM will not converge because of performance and management concerns.
"The idea of a 'converged' network security gateway will continue to have appeal, so vendors will continue to add functionality to reduce cost of firewall ownership to the customer and increase revenue to the vendor," said Brazil. "However, issues with performance and manageability will continue to force separate, purpose-built systems that will be deployed in enterprise networks. As such, there will continue to be enterprise firewalls that should not be considered UTMs."
Mike Rothman, analyst and president of Phoenix-based security firm Securosis LLC, said he believes that UTM and NGFW are essentially the same, and the differences are little more than marketing semantics. Rothman agreed that marketing from vendors caused confusion, but he also blamed analysts for adopting the term NGFW and driving it into the vernacular.
He said that early UTMs did have problems scaling performance from SMBs to larger enterprises, especially when trying to enforce both positive rules (firewall access) and negative rules (IPS), but that early NGFW had the same issues keeping up with wire speed when implementing threat prevention. He said that the perceived disparities were used to enforce market differentiation, and persist today, despite these scaling issues not being relevant anymore.
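Rothman's distinction between positive rules (firewall access control) and negative rules (IPS threat blocking) can be sketched as two checks on the same traffic: a packet must match an allow rule and match no threat signature. The rule shapes, signatures and traffic below are invented for illustration; real devices inspect traffic far more deeply than substring matching.

```python
# Minimal sketch of combined positive/negative rule enforcement.
# Everything here (apps, ports, signatures) is hypothetical.

ALLOW_RULES = [          # positive rules: (application, port) pairs permitted
    ("https", 443),
    ("dns", 53),
]
THREAT_SIGNATURES = [    # negative rules: payload patterns an IPS might flag
    "exploit-kit",
    "sql-injection",
]

def permit(app, port, payload):
    """Allow traffic only if an allow rule matches AND no signature fires."""
    allowed = (app, port) in ALLOW_RULES
    malicious = any(sig in payload for sig in THREAT_SIGNATURES)
    return allowed and not malicious

print(permit("https", 443, "GET /index.html"))        # permitted app, clean payload
print(permit("https", 443, "GET /?q=sql-injection"))  # permitted app, flagged payload
print(permit("telnet", 23, "login"))                  # no allow rule matches
```

The scaling problem the article describes comes from running both checks on every packet at wire speed; the logic is simple, but doing it without slowing traffic is what separated early UTMs and NGFWs.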
According to Rothman, the confusion lies not only in comparing the two device types, but also in the term "next-generation firewall" itself, which he thinks minimizes what the device does.
"What an NGFW does is bigger than just a firewall," said Rothman. "A firewall is about access control, basically enforcing what applications, ports, protocols, users, etc., are allowed to pass through the firewall. The NGFW also can look for and deny access to threats, like an IPS. So it's irritating that the device is called an NGFW, as it does more than just a firewall. We call it the Network Security Gateway, as that is a more descriptive term."
Rothman said that today's UTMs can do everything an NGFW can do, as long as they are configured properly and have the right policy integration. He said he believes that arguments about feature sets or target markets are examples of artificial distinctions that only serve to confuse the issue.
"From a customer perspective, the devices do the same thing," Rothman said. "The NGFW does both access control and threat prevention, as does the UTM, just a little differently in some devices. Ultimately, the industry needs to focus on what's important: Will the device scale to the traffic volumes they need to handle with all of the services turned on? That's the only question that matters."
Moving forward, despite differences in opinions, the experts agree that enterprises shouldn't go into a purchasing process by trying to decide whether they need an NGFW or a UTM. Rather, the ultimate goal should always be to focus on the best product to solve their problems.
Rothman said that the distinctions will go away as low-end UTM vendors add more application-inspection capabilities and more traditional NGFW vendors go downmarket by offering versions suitable for SMBs. He also said he doesn't expect an end to confusing vendor marketing anytime soon, so enterprises need to be careful to ignore these semantics and focus on finding the right product to address security needs.
Young said that in the short term, UTM and NGFW will remain separate and will continue to be mainstays for SMBs and larger enterprises, respectively, and the decision around which device to use will be a question of need.
The question of UTM vs. NGFW is still divisive, and experts have different ideas regarding if and where the two technologies diverge when looking at the issue from a vendor perspective. However, when looking at the issue from a customer perspective, the experts agree that focusing on an enterprise's security needs will help to mitigate the confusion and lead to the right product.
"It isn't just about technology, it is about how a small company's security is different than a big company's security," said Young. "It's all about the use case, not a 'versus.'"