Glosario KW | KW Glossary


Ontology Design | Diseño de Ontologías


U


U (MARKETING)

by System Administrator - Thursday, 2 May 2013, 10:35 PM
 

 

U

Ubiquity, UMTS, Unique visit, Upload, URL, Custom URL, Usability, USENET, Usenet user, Username, UTC, Utility.

 

=> Ubiquity (Ubicuidad): A characteristic of a website that consists in being available on the Internet and having its content locatable and viewable. It has two parts: the capacity to be found, or "findability", and the capacity to be seen, or "visibility".

=> UMTS: Acronym for Universal Mobile Telecommunications System. The European proposal to achieve an international standard for the third generation of mobile telephone systems. More information at: http://www.umts-forum.org/servlet/dycon/ztumts/umts/Live/en/umts/Home.

=> Unique visit (Única visita): An IP address corresponding to a user who enters a website in one day (or another specified period of time). Unique visitors constitute a website's audience over a given period.
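A minimal sketch of how this metric is usually computed from access-log data, counting each client IP address only once per day; the records below are invented for illustration:

```python
from collections import defaultdict

# Hypothetical access-log records as (date, client IP) pairs
hits = [
    ("2024-05-01", "203.0.113.7"),
    ("2024-05-01", "203.0.113.7"),   # same visitor again: still one unique visit
    ("2024-05-01", "198.51.100.23"),
    ("2024-05-02", "203.0.113.7"),
]

unique_per_day = defaultdict(set)
for day, ip in hits:
    unique_per_day[day].add(ip)      # a set keeps each IP only once per day

for day, ips in sorted(unique_per_day.items()):
    print(day, len(ips), "unique visits")
```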

=> Upload: The transfer of files from the computer where a web page is built to the FTP area of the server where it is hosted on the Internet.
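As a small illustrative sketch of such a transfer using Python's standard ftplib module; the host name, credentials and paths are placeholders, and many hosting providers now expect FTPS or SFTP rather than plain FTP:

```python
from ftplib import FTP

# Hypothetical hosting account; host, credentials and paths are placeholders
with FTP("ftp.example.com") as ftp:
    ftp.login(user="webmaster", passwd="secret")
    ftp.cwd("/public_html")                       # change to the web root on the server
    with open("index.html", "rb") as fh:
        ftp.storbinary("STOR index.html", fh)     # upload the local file
```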

=> URL: Acronym for Uniform Resource Locator, an address on the Internet. For example, in the address "http://www.domain.com/dir/subdir/file.html", the whole string is the URL: "http" indicates the protocol used, "www.domain.com" is the name of the server, "dir" is a directory, "subdir" a subdirectory, and "file.html" the name of a file.
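The same breakdown can be reproduced with Python's standard urllib.parse module; a short sketch using the example address above:

```python
from urllib.parse import urlparse

parts = urlparse("http://www.domain.com/dir/subdir/file.html")
print(parts.scheme)   # 'http'                    -> protocol
print(parts.netloc)   # 'www.domain.com'          -> server name
print(parts.path)     # '/dir/subdir/file.html'   -> directory, subdirectory and file
```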

=> Custom URL (URL personalizada): An address on a social network that points directly to a user's profile on it. For example: "http://www.facebook.com/mar.monsoriu". It improves how easily the profile can be found through search engines such as Google, and also within the social network itself by friends, professional contacts and followers, if any. In English: vanity URL.

=> Usability (Usabilidad): A characteristic resulting from the sum of the utility, ease of use and satisfaction perceived by the users who visit a website. Its analysis is one of the areas examined in a web audit.

=> USENET: The organization that, in 1979, oversaw the creation of newsgroups, better known to veteran Internet users simply as "news" from the English term "newsgroups". The name Usenet comes from "USEr NETwork", and it began between two universities in North Carolina, in the United States. More information at: http://usenet-addresses.mit.edu/.

=> Usenet user (Usenetero): An Internet user who participates actively in newsgroups, both to help other users by answering their questions and to find information or ask for help with the questions that come up in their personal or professional work. In practice, the opposite of a lurker.

=> Username: The name by which a user chooses to identify themselves to any program, website, database, FTP account or anything else on the Internet. For example, an Internet user named Luzdivina Méndez could perfectly well have a username such as "lmendez".

=> UTC: Initials of Universal Time Coordinated, the "coordinated universal time". The designation for Greenwich time.
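A brief Python sketch showing the current time in UTC and the same instant expressed in the local time zone:

```python
from datetime import datetime, timezone

now_utc = datetime.now(timezone.utc)
print(now_utc.isoformat())           # e.g. 2024-05-01T14:30:00.123456+00:00
print(now_utc.astimezone())          # the same instant in the local time zone
```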

=> Utility (Utilidad): A specific computing function that is attractive to online customers. Web portals are a collection of utilities. It is not the same as a service: a service, or more technically a "value-added service", would be, for example, sending an e-zine to our target audience under whatever conditions we consider most appropriate; a utility is, for example, letting users access an IRC network via the web.


U (PMI)

by System Administrator - Thursday, 9 May 2013, 2:53 AM
 

--------------------------------------------------

  • Usability: The extent to which an item has the capacity to be used, or is convenient and practical to use.

U (WEB SERVICES)

by System Administrator - Saturday, 1 June 2013, 3:23 PM
 
  • UDDI (Universal Description, Discovery, and Integration) - an XML-based registry for businesses worldwide to list themselves on the Internet.
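As a rough sketch only, a UDDI v2 inquiry is a SOAP message (such as find_business) posted to a registry's inquiry endpoint. The endpoint URL below is hypothetical, since public UDDI registries have largely been retired, and real registries may also require headers such as SOAPAction:

```python
import requests

# Hypothetical inquiry endpoint; replace with a real registry's inquiry URL if one exists
INQUIRY_URL = "https://uddi.example.com/inquiry"

FIND_BUSINESS = """<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
  <soapenv:Body>
    <find_business generic="2.0" xmlns="urn:uddi-org:api_v2">
      <name>Example Corp</name>
    </find_business>
  </soapenv:Body>
</soapenv:Envelope>"""

response = requests.post(
    INQUIRY_URL,
    data=FIND_BUSINESS.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8"},
)
print(response.status_code)
print(response.text[:300])  # a businessList with matching businessInfo entries, if any
```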

UID

by System Administrator - Tuesday, 30 April 2013, 1:38 PM
 

UID: Unique IDentifier.

A UID can contain an OID (Object Identifier) and an optional extension. These definitions are often confused. The use of UIDs is the basis for interoperability, where it is necessary that the same objects carry the same identifiers. The extension is meaningful when the OID only identifies the master table against which the extra value will be validated.
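As a rough illustration of that pattern, a short Python sketch that builds a UID from an OID root plus a locally assigned extension and checks the dotted-numeric form; the root value and helper name are invented for the example:

```python
import re

# Illustrative only: a registered OID root identifying the issuing system,
# plus a locally unique extension appended to it. Do not reuse this root value.
OID_ROOT = "2.16.840.1.113883.19"
DOTTED_NUMERIC = re.compile(r"^(0|[1-9][0-9]*)(\.(0|[1-9][0-9]*))+$")

def make_uid(extension: str) -> str:
    """Combine the OID root with an extension and check the dotted-numeric form."""
    uid = f"{OID_ROOT}.{extension}"
    if not DOTTED_NUMERIC.match(uid):
        raise ValueError(f"not a valid dotted-numeric UID: {uid!r}")
    return uid

print(make_uid("42"))   # -> 2.16.840.1.113883.19.42
```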


Uncloud (de-cloud)

by System Administrator - Wednesday, 11 November 2015, 4:06 PM
 

Uncloud (de-cloud)

Posted by Margaret Rouse

Uncloud is the removal of applications and data from a cloud computing platform.

In recent years, organizations ranging from small and medium-sized businesses to large enterprises have turned to the cloud to run applications, store data and accomplish other IT tasks. Over time, however, an organization may elect to uncloud one, a few or, possibly, all of its cloud-based assets. Examples could include shutting down a server instance in a public cloud and moving the associated software and data to an in-house data center or colocation facility. De-cloud is another term used to describe this reverse cloud migration.

In the process of unclouding, the cloud customer or, potentially, a channel partner acting on its behalf, will work with the cloud vendor to extract the customer's applications and data. The task involves locating the data and mapping the application's dependencies within the cloud vendor's infrastructure. The unclouding customer -- and its channel partner -- may encounter higher levels of complexity in the case of a public multi-tenant cloud setting. A customer may have to wait for the cloud vendor's scheduled downtime to migrate its applications and data, or the cloud provider may limit the customer's use of migration tools so as not to interfere with the application performance of other customers.

Customers may cite a number of reasons for wanting to uncloud. Factors include security issues, liability concerns and difficulty in integrating cloud-based applications with on-premises enterprise applications and data. Frustrated expectations with respect to the cloud's cost efficiency may also influence de-clouding decisions. Anecdotal evidence suggests that customers citing cost as a factor may elect to move applications to an in-house, hyper-converged infrastructure as the better economic choice.

Reverse migration on the rise: Channel partners see customers uncloud

by John Moore

Channel partners report that a small but increasing number of customers are moving some or all of their applications off the cloud.

Channel partners say some of their customers have begun to uncloud and are asking for help migrating back to in-house data centers or colocation facilities.

While cloud computing, in general, remains a high-growth area, a counter trend of reverse migration has started to surface. Organizations, industry executives said, cite a number of reasons for moving some or all of their applications off the cloud: security and compliance concerns, frustration over elusive cost savings, and the changing data center economics of hyper-converged architecture.

At Trace3 Inc., cost and hyper-convergence played key roles in one customer's off-the-cloud migration. Trace3, based in Irvine, Calif., focuses on data center, big data and cloud technologies. Mark Campbell, research principal and director of Innovation Research at Trace3, said a retail client recently completed a "back-sourcing" exercise in which it migrated its entire cloud footprint into a colocation data center, where the retailer could control the infrastructure.

"Cost was the primary driver," Campbell said of the retail customer. "They were estimating they could save 40% over their cloud IaaS [infrastructure as a service] and PaaS [platform as a service] provider by building their own private cloud built on hyper-converged and commodity infrastructure," he said.

Campbell noted that he hasn't had the opportunity to follow up with the company to see whether it actually realized the projected savings.

Getting off the cloud

Nevertheless, other Trace3 customers have taken steps to uncloud, a pattern Campbell began noticing last year. He said a few customers -- numbered in the dozens, out of a client base of some 2,000 companies -- have encountered issues in the cloud.

"The vast majority of our customers have moved at least some of their enterprise applications to the cloud, and the vast majority of those are continuing in the cloud," Campbell said. "There is a small minority, however, that are moving some or all of their applications back into their own data centers or colocation sites more under their control."

"We are exiting the honeymoon stage, and that is always a rude awakening -- and expectation readjustment -- for both parties."

Paul Dippell, CEO, Service Leadership

Irwin Teodoro, senior director of data center transformation at Datalink Corp., a data center services provider based in Eden Prairie, Minn., has also observed declouding among his company's customers. He said for every 10 companies pursuing some form of cloud computing, he has seen two or three looking to get out of the cloud.

"This is definitely a trend we are going to see more of."

The counter-cloud migration may signal a resetting of expectations among channel partner customers. Paul Dippell, CEO of Service Leadership Inc., a company based in Plano, Texas, that provides a financial and operational benchmark for channel companies, said cloud vendors tell customers that their offerings are "wonderful, weightless, agile, low cost, mobile [and] fantastically free of the impediments of past computing models."

But that vision doesn't always line up with reality.

"What the customers are experiencing is different enough that a material number of customers are declouding or significantly changing -- reducing -- their cloud strategies to regain a more solid computing foundation and rational cost," Dippell said.

"I don't expect cloud to fail, by any means, and I do expect it to grow," Dippell said. "But we are exiting the honeymoon stage, and that is always a rude awakening -- and expectation readjustment -- for both parties."

Dippell added that he's heard anecdotal accounts of solution providers winning new customers by agreeing to decloud them.

When customers uncloud: Top factors

A number of factors influence migration decisions. Unforeseen security issues, for example, may drive some applications back in-house. In general, risk and liability concerns are tempering enthusiasm for the cloud, said Dan Liutikas, managing attorney at InfoTech Law Advocates P.C., and chief legal officer and secretary at CompTIA.

Channel partners, as well as customers, are questioning whether cloud is the correct answer for every customer. While the cloud adoption wave continues, more and more service providers are weighing whether cloud is the right approach for a particular customer or a subset of customers, Liutikas said. The latter includes companies in highly regulated industries such as healthcare and financial services.

"Sometimes … on-premises is the better answer based on their customers' needs," he said.

Organizations may also struggle to achieve deep integration between their cloud applications and their on-premises legacy applications and data, according to industry executives. But beyond legal and technical hurdles, cost has become a sticking point for some cloud users.

Unexpected cloud costs may stem from a customer's failure to quantify all the necessary services in its initial calculations. Campbell said most customers tend to be accurate in estimating traditional infrastructure and capacity costs for servers, storage capacity and bandwidth, among other components. But on the other hand, they tend to underestimate the cost of items beyond their data centers. Those items include the cost of creating multiple storage snapshots to back up data, the cost of data replication and the cost of restoring data.

"This leads to budgetary surprises," Campbell said.

Cloud sprawl can also stress budgets.

"Much like [virtual machine] sprawl, it is not uncommon for the initial targets of a cloud installation to grow as both the IT and business discover new applications, features and snap-of-the-fingers capacity bursts," Campbell explained. "These all add line items to the monthly bill."

In addition, cloud offerings may run afoul of conventional budget controls.

Campbell said traditional IT organizations built their financial processes and controls to monitor big-ticket items such as projects and large Capex purchases and smaller items such as consumables and onetime Opex expenditures handled on an approval basis.

"This works great in a data-center-centric operation, but imagine the befuddled expression on the comptroller's face when he gets his first 23,000-line-item bill from Amazon," Campbell said. "It is very hard to even decipher what these expenditures are for, let alone garner business justification."

Customers disappointed with cloud cost savings may end up migrating applications to hyper-converged infrastructures.

"Some [companies] are pulling in applications from the cloud to their data centers," Campbell said. "If they do that, we are seeing hyper-convergenceas being one of those enabling mechanisms."

In addition to cost, corporate culture can play a role in a cloud reversal.

"Executives who are not fully aware of the concepts of the cloud feel somewhat apprehensive that data is somewhere else and feel lack of control," Teodoro said.

Managing the declouding challenge

Assisting customers as they back out of the cloud can prove difficult. Teodoro said public clouds in which multiple customers share a common infrastructure represent the greatest challenge. Dealing with maintenance windows is one issue. A customer can't just extract an application based on its own ad hoc maintenance timetable; they have to wait for the cloud provider's scheduled downtime.

"You can't move when you want to move," Teodoro said. "You've got to move at somebody else's pace and schedule."

Determining a cloud-based application's dependencies with respect to the cloud provider's infrastructure is another consideration. A channel partner working on an off-the-cloud migration project needs to figure out what virtual machines the application resides on and identify the virtual LANs and subnets in the compute infrastructure to which the application can be traced, Teodoro explained. The goal: extract the application without breaking something in the environment.

"The keys for us are really to understand the dependencies in the environment -- down to the infrastructure -- and find ways to carve out the environment into smaller chunks or workgroups," Teodoro said.

Another complication: Migration tools can help channel companies uncloud customers, automating the tasks of data gathering, analysis and forensics. But in a shared, multi-tenant cloud, service providers can't use their own tools, since they could impact a cloud provider's other clients, Teodoro said.

Seeking a happy balance

Customers juggling multiple IT environments provide yet another degree of difficulty. Jim Piazza, vice president of service management at CenturyLink Inc., which offers colocation, public cloud and IT services, said customers such as software as a service providers may offer multiple versions of their software to support different clients. And those different versions may be hosted on different computing platforms: in-house private clouds, colocation centers and public clouds, for example.

"It's an interesting mix … that is really quite a challenge to manage," Piazza said.

Piazza said CenturyLink, based in Monroe, La., provides customers a service catalog to help them keep track of what version of their software is deployed where. In addition, the company has built interconnects between customers' colocation footprints in CenturyLink facilities and CenturyLink's public cloud. The service catalog and interconnects enable the company's clients to move their end customers from one platform to another, Piazza said.

Piazza likened migrating customers and their workloads among the various platforms to supporting a 3D jigsaw puzzle.

For Campbell, the cloud conundrum boils down to harmonizing the computing platforms now available to customers.

"It's finding that happy balance -- what lives best in the cloud and what lives best in-house."



Unified Endpoint Management

by System Administrator - Thursday, 30 March 2017, 4:54 PM
 

Unified Endpoint Management (UEM)

Posted by: Margaret Rouse | Contributor(s): Colin Steele

Unified endpoint management (UEM) is an approach to securing and controlling desktop computers, laptops, smartphones and tablets in a connected, cohesive manner from a single console. Unified endpoint management typically relies on the mobile device management (MDM) application program interfaces (APIs) in desktop and mobile operating systems.

Microsoft's inclusion of MDM application program interfaces in Windows 10 made unified endpoint management a possibility on a large scale. Prior to the release of Windows 8.1, there was no way for MDM software to access, secure or control the operating system and its applications. 

In Windows 10, the tasks IT can perform through MDM software include:

  • configuring devices' VPN, email and Wi-Fi settings;
  • enforcing passcode and access policies;
  • installing patches and updates;
  • blacklisting and whitelisting applications; and
  • installing and managing Universal Windows Platform (.appx) and Microsoft Installer (.msi) applications.
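To make the mechanism concrete, the following is a hedged sketch of the kind of OMA-DM SyncML payload an MDM service might send to a Windows 10 device to enforce one of the passcode settings listed above; the Policy CSP path, command ID and value are assumptions for illustration and should be checked against Microsoft's configuration service provider reference:

```python
# Hedged sketch: an OMA-DM SyncML <Replace> enforcing a minimum passcode length.
# The CSP path, CmdID and value below are illustrative assumptions, not a verified payload.
LOC_URI = "./Vendor/MSFT/Policy/Config/DeviceLock/MinDevicePasswordLength"

syncml_payload = f"""<SyncML xmlns="SYNCML:SYNCML1.2">
  <SyncBody>
    <Replace>
      <CmdID>1</CmdID>
      <Item>
        <Target><LocURI>{LOC_URI}</LocURI></Target>
        <Meta><Format xmlns="syncml:metinf">int</Format></Meta>
        <Data>8</Data>
      </Item>
    </Replace>
    <Final/>
  </SyncBody>
</SyncML>"""

print(syncml_payload)
```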

Mobile device management is significantly less robust than traditional Windows management tools, however. Examples of tasks information technology (IT) administrators can't perform through Windows 10 MDM APIs include:

  • deploying and managing legacy executable (.exe) applications;
  • enforcing encryption;
  • deploying Group Policy Objects; and
  • managing printers, file shares and other domain-based resources.

Many vendors market UEM as a feature of their broader enterprise mobility management (EMM) software suites, and some EMM vendors have made strides to close the gap between MDM and traditional Windows management tools. For example, MobileIron Bridge allows IT administrators to use MDM to deploy scripts that modify the Windows 10 file system and registry and perform other advanced tasks, including deploying legacy .exe applications.

Other vendors that support UEM include VMware, Citrix, BlackBerry and Apple. Apple's Mac OS X operating system has included MDM APIs since at least 2012, when AirWatch and MobileIron announced support. Today, all of the major vendors that offer UEM also support OS X.

Link: http://searchenterprisedesktop.techtarget.com


Unstructured Data

by System Administrator - Tuesday, 18 April 2017, 3:07 PM
 

Pulling Insights from Unstructured Data – Nine Key Steps

by Salil Godika

Data, data everywhere, but not a drop to use. Companies are increasingly confronted with floods of data, including "unstructured data," which is information from within email messages, social posts, phone calls and other sources that isn't easily put into traditional rows and columns. Making sense of structured data and turning it into actionable recommendations is difficult, and doing so with unstructured data is even harder.

Despite the challenge, the benefits can be substantial. Companies that commit to examining unstructured data that comes from devices and other sources should be able to find hidden correlations and surprising insights. It promotes trend discovery and opens opportunities in ways that traditionally-structured data cannot.

Analyzing unstructured data can be best accomplished by following these nine steps:

1. Gather the data

Unstructured data means there are multiple unrelated sources. You need to find the information that needs to be analyzed and pull it together. Make sure the data is relevant so that you can ultimately build correlations.

2. Find a method

You need a method in place to analyze the data and have at least a broad idea of what should be the end result. Are you looking for a sales trend, a more traditional metric, or overall customer sentiment? Create a plan for finding a result and what will be done with the information going forward.

3. Get the right stack

The raw data you pull will likely come from many sources, but the results have to be put into a tech stack or cloud storage in order for them to be operationally useful. Consider the final requirements that you want to achieve and then judge the best stack. Some basic requirements are real-time access and high availability. If you’re running an ecommerce firm, then you want real-time capabilities and also want to be sure you can manage social media on the fly based on trend data.

 

4. Put the data in a lake

Organizations that want to keep information will typically scrub it and then store it in a data warehouse. This is a clean way to manage data, but in the age of Big Data it removes the chance to find surprising results. The newer technique is to let the data swim in a “data lake” in its native form. If a department wants to perform some analysis, they simply dip into the lake and pull the data. But the original content remains in the lake so future investigations can find correlations and new results.

5. Prep for storage

To make the data useful (while keeping the original in the lake), it is wise to clean it up. For example, text files can contain a lot of noise, symbols or whitespace that should be removed. Duplicates and missing values should also be detected so analysis will be more efficient.
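As an illustrative sketch, assuming the raw text has already been gathered into a pandas DataFrame (the sample records are invented), the cleanup might look like this:

```python
import pandas as pd

# Invented raw records; real input would come from the collection step
df = pd.DataFrame({"text": [" Great product!!! ", "great product!!!", None,
                            "Mixed   spacing\tand %% symbols"]})

df["text"] = (df["text"]
              .str.lower()
              .str.replace(r"[^a-z0-9\s]", " ", regex=True)  # strip symbols and punctuation
              .str.replace(r"\s+", " ", regex=True)          # collapse runs of whitespace
              .str.strip())

df = df.dropna(subset=["text"]).drop_duplicates(subset=["text"])  # drop missing values and dupes
print(df)
```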

6. Find the useful information amongst the clutter

Semantic analysis and natural language processing techniques can be used to pull various phrases as well as the relationship to that phrase. For example “location” can be searched and categorized from speech in order to establish a caller’s location.
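For example, a short sketch using the spaCy library (assuming its small English model, en_core_web_sm, is installed) to pull location-like entities out of a transcribed call:

```python
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the small English model has been downloaded
doc = nlp("The caller said she was stranded near Union Station in Denver last night.")

for ent in doc.ents:
    if ent.label_ in ("GPE", "LOC", "FAC"):  # geopolitical entities, locations, facilities
        print(ent.text, ent.label_)
```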

7. Build relationships

This step takes time, but it's where the actionable insights lie. By establishing relationships between the various sources, you can build a more structured database which will have more layers and complexity (in a good way) than a traditional single-source database.

8. Employing statistical modeling

Segmenting and classifying the data comes next. Use tools such as K-means, Naïve Bayes, and Support Vector Machine algorithms to do the heavy lifting to find correlations. You can use sentiment analysis to gauge customers' moods over time and how they are influenced by product offerings, new customer service channels, and other business changes. Temporal modeling can be applied to social media and forums to find the most relevant topics being discussed by your customers. This is valuable information for social media managers who want the brand to stay relevant.
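A compact sketch of the clustering part of this step using scikit-learn: TF-IDF vectors of a few invented customer comments are grouped with K-means; sentiment or temporal models would slot into the same pipeline with different estimators:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Invented customer comments standing in for cleaned unstructured text
comments = [
    "delivery was late and support never answered",
    "support took days to reply about my late delivery",
    "love the new mobile app, checkout is so fast",
    "the redesigned app makes ordering really quick",
]

X = TfidfVectorizer(stop_words="english").fit_transform(comments)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(km.labels_)  # e.g. [0 0 1 1]: delivery complaints vs. praise for the app
```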

9. End results matter

The end result of all this work has to be condensed down to a simplified presentation. Ideally, the information can be viewed on a tablet or phone and helps the recipient make smart real-time decisions. They won’t see the prior eight steps of work, but the payoff should be in the accuracy and depth of the data recommendations.

Every company’s management is pushing the importance of social media and customer service as the main drivers of company success. However, these services can provide another layer of assistance to firms after diagnostic tools are applied to their underlying data. IT staff need to develop certain skills in order to properly collect, store, and analyze unstructured data in order to compare it with structured data to see the company and its users in a whole new way.

 

About the author: Salil Godika is Co-Founder, Chief Strategy & Marketing Officer and Industry Group Head at Happiest Minds Technologies. Salil has 18 years of experience in the IT industry across global product and services companies. Prior to Happiest Minds, Salil was with MindTree for 4 years as the Chief Strategy Officer. Before MindTree, Salil spent 12 years in the United States working for start-ups and large technology product companies like Dassault Systems, EMC and i2 Technologies. His accomplishments include incubating a new product to $30 million in revenue, successful market positioning of multiple products, global marketing for a $300 million business and multiple M&As.


Link: https://www.datanami.com


User Interface (UI)

by System Administrator - Saturday, 1 June 2013, 3:34 PM
 

Also see human-computer interaction.

In information technology, the user interface (UI) is everything designed into an information device with which a human being may interact -- including display screen, keyboard, mouse, light pen, the appearance of a desktop, illuminated characters, help messages, and how an application program or a Web site invites interaction and responds to it. In early computers, there was very little user interface except for a few buttons at an operator's console. The user interface was largely in the form of punched card input and report output.

Later, a user was provided the ability to interact with a computer online and the user interface was a nearly blank display screen with a command line, a keyboard, and a set of commands and computer responses that were exchanged. This command line interface led to one in which menus (list of choices written in text) predominated. And, finally, the graphical user interface (GUI) arrived, originating mainly in Xerox's Palo Alto Research Center, adopted and enhanced by Apple Computer, and finally effectively standardized by Microsoft in its Windows operating systems.

The user interface can arguably include the total "user experience," which may include the aesthetic appearance of the device, response time, and the content that is presented to the user within the context of the user interface.

RELATED GLOSSARY TERMS: search engine, cyberprise, namespace, Webification, killer app, service-component architecture (SCA), Webify, Project Tango, Personal Web Server (PWS), MQSeries

 
Contributor(s): Mike Dang
This was last updated in April 2005
Posted by: Margaret Rouse

Usuario Final / End User

by System Administrator - Sunday, 31 March 2013, 8:42 PM
 

The End User is the operator of the system. There are different user profiles, ranging from the Supervisor down to a user who only uses certain limited, specific functions.


UTM vs. NGFW: Unique products or advertising semantics?

by System Administrator - Wednesday, 18 February 2015, 6:49 PM
 

UTM vs. NGFW: Unique products or advertising semantics?

by: Michael Heller

In comparing UTM vs. NGFW, organizations find it difficult to see if there are differences between the two products or if it is just marketing semantics.

It can often be difficult to discern the difference between unified threat management (UTM) and next-generation firewalls (NGFW). Experts agree that the lines appear to be blurring between the two product sets, but enterprises that focus on defining each product type during the purchasing process may be making a mistake.

NGFWs emerged more than a decade ago in response to enterprises that wanted to combine traditional port and protocol filtering with IDS/IPS functionality and the ability to detect application-layer traffic; over time they added more features like deep-packet inspection and malware detection.

Meanwhile, UTMs were born of a need among small and midsize businesses not only for firewall functionality, but also for IDS/IPS, antimalware, antispam and content filtering in a single, easy-to-manage appliance. More recently, UTMs have added features like VPN, load balancing and data loss prevention (DLP), and are increasingly delivered as a service via the cloud.

According to Jody Brazil, CEO of Overland Park, Kan.-based security management firm FireMon LLC, SMBs and remote office locations were attracted to the UTM, but larger enterprises tended to favor the NGFW alongside standalone devices throughout the network, minimizing the impact on firewall performance.

Greg Young, research vice president for Stamford, Conn.-based Gartner Inc., said larger enterprises have had the budgets to buy the best technology, and the staff to support the more advanced features and better performance afforded by NGFWs. On the other hand, SMBs not only wanted an all-in-one product, but also needed extra support from the channel to manage the device, even if it meant that each feature of the UTM was good but not the best.

"Service providers for ISPs have different needs than enterprises," said Young. "So, UTM vendors will only offer basic firewall features as a price-play for that market."

Young said those differences in ease of use and support demands still exist today, though they have become more nuanced; there is overlap in the underlying technology of NGFW and UTM, and spec sheets tend to look similar. Young said that the key differences now are more around quality of features, and the level of support from channel partners to meet customer needs.

Young also noted that vendors tend to excel in one market or the other, like Fortinet Inc. with UTM for SMBs, or Palo Alto Networks Inc. with NGFW for enterprises. Few vendors can succeed in both, he said, like Check Point Software Technologies Ltd. has done.

"The confusion came from SMB vendors trying to move into the enterprise market without making channel and quality changes," said Young. "It was an intentional campaign to confuse, but very few end users are confused about what they need. It is either a racecar [NGFW] or a family van [UTM]."

Brazil admitted that the differences between NGFW and UTM can be confusing, even for experienced practitioners, but described UTM as a collection of unrelated security features, one of which is the firewall.

"UTM generally refers to a firewall with a mix of other 'bolted-on' security functions like antivirus and even email spam protection," said Brazil. "These are not access control features that typically define a firewall."

What traditionally has defined next-gen firewalls, Brazil said, is robust Layer 7 application access control, though an increasing number of NGFWs are being augmented with integrated threat intelligence, enabling them to deny known threats based on a broad variety of automatically updated policy definitions.  

However, Brazil did caveat his distinctions by saying that a UTM could be considered an NGFW if it met the Layer 7 parameters, and an NGFW that included malware functions could be considered a UTM. Though, he was clear that despite these potential overlap points, he would keep the classifications separate because of a lack of similarities in other respects, like access control.

Brazil said that NGFW will eventually become the standard, and the terms NGFW and firewall will become synonymous. He said UTM will remain an important product for SMBs, especially when a company prioritizes simplicity of deployment over the depth of security and performance, but NGFW and UTM will not converge because of performance and management concerns.

"The idea of a 'converged' network security gateway will continue to have appeal, so vendors will continue to add functionality to reduce cost of firewall ownership to the customer and increase revenue to the vendor," said Brazil. "However, issues with performance and manageability will continue to force separate, purpose-built systems that will be deployed in enterprise networks. As such, there will continue to be enterprise firewalls that should not be considered UTMs."

Mike Rothman, analyst and president for Phoenix-based security firm Securosis LLC, said he believes that UTM and NGFW are essentially the same, and the differences are little more than marketing semantics. Rothman agreed that marketing from vendors caused confusion, but also blamed analysts for adopting the term NGFW and driving it into the vernacular.

He said that early UTMs did have problems scaling performance from SMBs to larger enterprises, especially when trying to enforce both positive rules (firewall access) and negative rules (IPS), but that early NGFW had the same issues keeping up with wire speed when implementing threat prevention. He said that the perceived disparities were used to enforce market differentiation, and persist today, despite these scaling issues not being relevant anymore.

According to Rothman, the confusion lies not only in comparing the two device types, but also in the term "next-generation firewall" itself, which he thinks minimizes what the device does.

"What an NGFW does is bigger than just a firewall," said Rothman. "A firewall is about access control, basically enforcing what applications, ports, protocols, users, etc., are allowed to pass through the firewall. The NGFW also can look for and deny access to threats, like an IPS. So it's irritating that the device is called an NGFW, as it does more than just a firewall. We call it the Network Security Gateway, as that is a more descriptive term."

Rothman said that today's UTMs can do everything an NGFW can do, as long as they are configured properly and have the right policy integration. He said he believes that arguments about feature sets or target markets are examples of artificial distinctions that only serve to confuse the issue.

"From a customer perspective, the devices do the same thing," Rothman said. "The NGFW does both access control and threat prevention, as does the UTM, just a little differently in some devices. Ultimately, the industry needs to focus on what's important: Will the device scale to the traffic volumes they need to handle with all of the services turned on? That's the only question that matters."

Moving forward, despite differences in opinions, the experts agree that enterprises shouldn't go into a purchasing process by trying to decide whether they need an NGFW or a UTM. Rather, the ultimate goal should always be to focus on the best product to solve their problems.

Rothman said that the distinctions will go away as low-end UTM vendors add more application-inspection capabilities and more traditional NGFW vendors go downmarket by offering versions suitable for SMBs. He also said he doesn't expect an end to confusing vendor marketing anytime soon, so enterprises need to be careful to ignore these semantics and focus on finding the right product to address security needs.

Young said that in the short term, UTM and NGFW will remain separate and will both continue to be mainstays for SMBs and larger enterprises respectively, and the decision around what device to use will be a question of need.

The question of UTM vs. NGFW is still divisive, and experts have different ideas regarding if and where the two technologies diverge when looking at the issue from a vendor perspective. However, when looking at the issue from a customer perspective, the experts agree that focusing on an enterprise's security needs will help to mitigate the confusion and lead to the right product.

"It isn't just about technology, it is about how small company's security is different than a big company's security," said Young. "It's all about the use case, not a 'versus.'"


Link: http://searchsecurity.techtarget.com