Glosario KW | KW Glossary

Ontology Design | Diseño de Ontologías



by System Administrator - Wednesday, 10 July 2013, 7:30 PM
  • Jabber - Jabber is an initiative to produce an open source, XML-based inst...
  • Jikes - Jikes is an open source Java compiler from IBM that adheres strictl...
by System Administrator - Wednesday, 10 July 2013, 7:34 PM
  • K Desktop Environment - K Desktop Environment (KDE) is an O...
  • K Desktop Environment (KDE) - K Desktop Environment (...
  • KDE - K Desktop Environment (KDE) is an Open Source graphical desktop environ...
  • kernel panic - A kernel panic is a computer error from which the ope...
  • Korn shell - The Korn shell is the UNIX shell (command execution progr...
  • Kubuntu - Ubuntu (pronounced oo-BOON-too) is an open source Debian-based ...
by System Administrator - Wednesday, 10 July 2013, 7:33 PM
by System Administrator - Wednesday, 10 July 2013, 7:35 PM
  • Mason - HTML::Mason (generally referred to simply as Mason) is an open sour...
  • MEPIS - MEPIS (pronounced MEHP-us) is a Linux distribution that includes th...
  • minicom - Minicom (usually written with an initial lowercase m) is a text...
  • Minix - Minix (sometimes spelled MINIX) is an open source operating system ...
  • Moblin - Moblin is a Linux-based platform that is optimized for small comp...
  • Mono Silverlight - Moonlight is an open source implementation of...
  • Moonlight - Moonlight is an open source implementation of Microsoft's S...
  • Morphis - Morphis is a Java -based open source wireless transcoding platf...
  • Mozilla - Mozilla was Netscape Communications' nickname for Navigator, it...
  • MyODBC - MySQL Connector/ODBC (sometimes called just Connector ODBC or MyO...
  • MySQL - MySQL is a relational database management system (RDBMS) based on S...
  • MySQL Connector - MySQL Connector/ODBC (sometimes called just Con...
  • MySQL Connector/ODBC (Connector ODBC or MyODBC)
  • MySQL ODBC - MySQL Connector/ODBC (sometimes called just Connector ODB...

Internet Marketing

by System Administrator - Saturday, 4 July 2015, 10:13 PM

Logical Framework Methodology

by System Administrator - Monday, 22 December 2014, 10:54 PM

Formulating programs with the Logical Framework methodology

This Manual aims to help the reader understand and apply the basic principles of the Logical Framework approach and the way it is used in the design and subsequent evaluation of projects and programs. The reasons for writing the Manual – and for giving it the content it has – lie in the renewed interest in having an instrument, at once versatile and extremely powerful, to support so-called Results-Based Management in public-sector programs and projects. This does not mean the Manual is not useful for private-sector programs and projects; rather, it is written for a setting in which its expected users are officials and consultants working in or for the public sector. Results-Based Management is a concern shared by the governments of Latin America and the Caribbean, given the urgent need for sustainable economic, political and social development that the region's inhabitants require to improve their quality of life. It is an approach with characteristics that distinguish it from more traditional forms of government management, such as management by pre-established functions.

The Logical Framework method, like any other method, must be applied within a given context, which is its domain of validity. Unfortunately, this basic precept is very often forgotten: a matrix with four rows and four columns is built directly and given the name Logical Framework Matrix without going through any of the prior phases. The result will certainly be a four-by-four matrix, but not necessarily a Logical Framework Matrix, even if one decides to call it that. What defines the Logical Framework Methodology is not the final product but the process that must be followed to arrive at the Logical Framework Matrix.

For this reason, this manual is not a recipe book for filling in four-by-four matrices; instead, it lays out some methodological paths for arriving at a Logical Framework and its corresponding matrix.
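As a purely illustrative sketch (not part of the Manual), the four-by-four structure discussed above can be represented as a simple data object: four rows (Goal, Purpose, Components, Activities) crossed with four columns (Narrative Summary, Indicators, Means of Verification, Assumptions). The row and column names follow common logframe usage and are assumptions here, not prescribed by the Manual. As the text stresses, filling such a grid is not the methodology itself; it is merely its final artifact.

```python
# Illustrative sketch of the 4x4 Logical Framework Matrix structure.
# The row/column labels follow common logframe conventions; they are
# an assumption for this example, not taken from the Manual.
ROWS = ["Goal", "Purpose", "Components", "Activities"]
COLS = ["Narrative Summary", "Indicators", "Means of Verification", "Assumptions"]

def empty_logframe():
    """Return a blank 4x4 matrix keyed by (row, column)."""
    return {(r, c): "" for r in ROWS for c in COLS}

matrix = empty_logframe()
# A hypothetical entry for one cell:
matrix[("Purpose", "Indicators")] = "% of target population served by year 3"

assert len(matrix) == 16  # four rows x four columns
```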

Please read the attached PDF file.


Mobile Security

by System Administrator - Tuesday, 14 July 2015, 6:57 PM

Mobile Security

For today's enterprises, mobile security is becoming a top priority. As mobile devices proliferate in the workplace, companies need to be careful that these devices—as well as the networks and data that they access—are adequately protected. With the growing number of threats aimed at exploiting mobile devices, safeguarding them is becoming complicated, but crucial.

In this eGuide, Computerworld UK, CSO, and IDG News Service examine some of the recent trends in mobile threats as well as ways to protect against them. Read on to learn how mobile security measures can help protect your organization.

Please read the attached eGuide.

by System Administrator - Wednesday, 10 July 2013, 7:36 PM
by System Administrator - Wednesday, 10 July 2013, 7:38 PM
by System Administrator - Wednesday, 10 July 2013, 7:39 PM

Preventing Data Loss Through Privileged Access Channels

by System Administrator - Wednesday, 7 October 2015, 4:14 PM

Preventing Data Loss Through Privileged Access Channels

A basic tenet of security is to apply the strongest safeguards to the highest-value assets. System and IT administrators are privileged users with access to very high-value assets and hosts across the enterprise. Their access rights include, to name a few, the ability to create new virtual machines, change operating system configurations, modify applications and databases, and install new devices - and, above all, direct access to the organization's protected data (financial, health, personnel, or intellectual property, for example). If misused, the privileges they are granted can have devastating consequences.

At the same time, the extent of privileged access is expanding to entities outside the enterprise through outsourcing arrangements, business partnerships, supply chain integration, and cloud services. The growing importance and prevalence of third-party access is bringing matters of trust, auditability and data loss prevention to the forefront of security compliance and risk management.

Because compliance with virtually any standard or mandate requires privileged access to be secured by encryption, that access is by default opaque to many standard layered defenses, such as next-generation firewalls and data loss prevention systems. The resulting loss of visibility and control creates a heightened risk of undetected data loss and system damage, as well as an attractive attack vector for malicious activity such as stealing information and disrupting operations while hiding all traces from system audit logs. Auditors thoroughly test privileged access controls because they are key controls for organizations such as those in the financial and health industries. Lack of visibility into administrators' activities will lead to audit exceptions.

This white paper focuses on how organizations facing these issues of privileged access can effectively balance the challenges of cost, risk and compliance. It describes how privileged access governance can be made minimally invasive, scale to enterprise requirements, and most importantly, prevent costly losses and potential audit exceptions.

Please read the attached whitepaper.

by System Administrator - Wednesday, 10 July 2013, 7:40 PM
  • Quagga - Quagga is an open source suite of applications for the management...
  • Quiz: Open and shut! - A quiz about open source, proprietary...
by System Administrator - Wednesday, 10 July 2013, 7:41 PM
by System Administrator - Friday, 27 February 2015, 11:27 AM

RASP helps apps protect themselves, but is it ready for the enterprise?

by Nicole Laskowski

A new technology called runtime application self-protection is being touted as a next big thing in application security. But not everyone is singing its praises.

In the application economy, a perimeter defense is no longer a good offense. With the proliferation of mobile devices and cloud-based technologies, perimeters are all but disappearing, according to Joseph Feiman, an analyst with Gartner Inc. "The more we move from place to place with our mobile devices, the less reliable perimeter-based technology becomes," he said.

Firewalls and intrusion prevention systems, which enterprises spent an estimated $9.1 billion on last year, still serve a vital purpose. But, given the enterprise infrastructure's growing sprawl, CIOs should be thinking about security breadth as well as security depth and how to scale their strategies down to the applications themselves, even building in a strikingly human feature: self-awareness.

A new tool for the application security toolbox known as runtime application self-protection (RASP) could help CIOs get there, but, according to one expert, it's no silver bullet.

Guarding the application

The security measures many CIOs have in place don't do much to safeguard actual applications, according to Feiman. Network firewalls, identity access management, intrusion detection and endpoint protection provide security at different levels, but none of them can see beyond the application layer. "Can you imagine a person who walks out of the house and into the city always surrounded by bodyguards because he has no muscles and no skills?" Feiman said. "That is a direct analogy with the application." Strip away features like perimeter firewalls, and the application is basically defenseless.

Defenseless applications leave enterprises vulnerable to external -- and internal -- threats. "High-profile security breaches illustrate the growing determination and sophistication of attackers," said Johann Schleier-Smith, CTO at if(we), a social and mobile technology company based in San Francisco. "They have also forced the industry to confront the limitations of traditional security measures."

Application security testing tools help detect flaws and weaknesses, but the tools aren't comprehensive, Feiman said during a Gartner Security and Risk Management Summit last summer. Static application security testing, for example, analyzes source, binary or byte code to uncover bugs but only before the application is operational. Dynamic application security testing, on the other hand, simulates attacks on the application while it's operational and analyzes the response but only for Web applications that use HTTP, according to Gary McGraw, CTO of the software security consulting firm Cigital Inc.

Even when taken together, these two technologies still can't see what happens inside the application while it's operational. And, according to Feiman's research report Stop Protecting Your Apps; It's Time for Apps to Protect Themselves, published in September 2014, static and dynamic testing, whether accomplished with premises-based tools or purchased as a service, can be time-consuming and hard to scale as the enterprise app portfolio multiplies.
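As a toy illustration of the static side of this distinction (the pattern, sample code and function names are invented for the example): a static checker inspects source text for risky constructs, such as string-concatenated SQL, without ever running the program, which is why it can work before the application is operational.

```python
import re

# Toy "static analysis" check: scan source text for string-concatenated
# SQL passed to execute(), without executing anything. A real SAST tool
# parses code properly; this regex sketch only shows the principle.
def static_scan(source_code: str) -> list:
    pattern = re.compile(r'execute\(\s*["\'].*"\s*\+')
    return [i + 1 for i, line in enumerate(source_code.splitlines())
            if pattern.search(line)]

sample = 'def lookup(uid):\n    cur.execute("SELECT * FROM users WHERE id=" + uid)\n'
print(static_scan(sample))  # → [2]: flags the concatenated query on line 2
```

A dynamic (DAST) tool would instead probe the running application over HTTP and observe its responses, which is why it is limited to operational Web applications, as McGraw notes above.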

Is RASP the answer?

That's why Feiman is keeping an eye on a budding technology Gartner calls RASP or runtime application self-protection. "It is the only technology that has complete insight into what's going on in the application," he said.

RASP, which can be applied to Web and non-Web applications, doesn't affect the application design itself; instead, detection and protection features are added to the servers an application runs on. "Being a part of the virtual machine, RASP sees every instruction being executed, and it can see whether a set of instructions is an attack or not," he said. The technology works in two modes: It can be set to diagnostic mode to sound an alarm; or it can be set to self-protection mode to "stop an execution that would lead to a malicious exploit," Feiman said.
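The two modes Feiman describes can be sketched in miniature. The following is a hypothetical illustration, not any vendor's actual API: an in-process hook inspects each database call from inside the runtime and either records an alarm (diagnostic mode) or stops execution (self-protection mode).

```python
# Hypothetical in-process RASP-style hook (illustrative only): inspects
# each SQL call from inside the runtime and reacts per configured mode.
import re

MODE = "protect"  # "diagnose" = alarm only; "protect" = block execution

# A deliberately tiny injection signature for the sketch:
SQLI = re.compile(r"('\s*OR\s+'1'\s*=\s*'1|;\s*DROP\s+TABLE)", re.IGNORECASE)

alerts = []

def guarded_query(run_query, sql):
    if SQLI.search(sql):
        alerts.append(sql)                # diagnostic mode: sound an alarm
        if MODE == "protect":             # self-protection mode: stop it
            raise PermissionError("RASP blocked a suspected injection")
    return run_query(sql)

# Usage with a stand-in for the real database layer:
executed = []
guarded_query(executed.append, "SELECT * FROM accounts WHERE id = 7")
try:
    guarded_query(executed.append, "SELECT * FROM accounts WHERE id = '' OR '1'='1'")
except PermissionError as e:
    print(e)  # the malicious statement never reaches `executed`
```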

The technology is offered by a handful of vendors. Many, such as Waratek, founded in 2009, are new to the market, but CIOs will recognize at least one vendor getting into the RASP game: Hewlett-Packard. Currently, RASP technology is built for the two popular application runtimes: the Java virtual machine and the .NET Common Language Runtime. Additional implementations are expected to be rolled out as the technology matures.

While Feiman pointed to the technology's "unmatched accuracy," he did note a couple of challenges: The technology is language dependent, which means it will have to be implemented separately for the Java virtual machine versus the .NET CLR. And because RASP sits on the application server, it consumes CPU cycles. "Emerging RASP vendors report 2% to 3% of performance overhead, and some other evidence reports 10% or more," Feiman wrote in Runtime Application Self-Protection: Technical Capabilities, published in 2012.

Is it ready for primetime?

Not everyone is ready to endorse RASP. "I don't think it's ready for primetime," said Cigital's McGraw. RASP isn't a bad idea in principle, he said, "but in practice, it's only worked for one or two weak categories of bugs."

The statement was echoed by if(we)'s Schleier-Smith: "What remains to be seen is whether the value RASP brings beyond Web application firewalls and other established technologies offsets the potential additional complexity," he said.

CIOs may be better off creating an inventory of applications segmented by type -- mobile, cloud-based, Web-facing. "And choose the [security] technology stack most appropriate for the types of applications found in their portfolio," McGraw said.

Even Feiman stressed that CIOs need to find a use case for the technology and consider how aggressive in general the organization is when adopting emerging technologies. For more conservative organizations, investing in RASP could still be two to five years out, he said.

To strengthen application security right now, McGraw urged CIOs to remember the power of static testing, which works on all kinds of software. And he suggested they investigate how thoroughly tools such as static and dynamic testing are being utilized by their staff. "The security people are not really testing people," he said, referring to software developers. "So when they first applied dynamic testing to security, nobody bothered to check how much of the code was actually tested. And the answer was: Not very much."

An even better strategy: Rather than place too much emphasis on RASP or SAST or DAST, application security should start with application design. "Half of software security issues are design problems and not silly little bugs," McGraw said.

Let us know what you think of the story; email Nicole Laskowski, senior news writer, or find her on Twitter @TT_Nicole.


by System Administrator - Wednesday, 10 July 2013, 7:42 PM
  • Samba - Samba is a popular freeware program that allows end users to access...
  • Secure Shell Charter - The Secure Shell Charter (SSH Charter...
  • Secure Shell Charter (SSH Charter) - The Secur...
  • shared source - Shared source is a software licensing concept that ...
  • Slackware - Slackware is the earliest distribution of the Linux operati...
  • Slashdot - Slashdot is a socially curated website dedicated to technolog...
  • smbclient - Samba is a popular freeware program that allows end users t...
  • Snort - Snort is an open source network intrusion detection system (NIDS) c...
  • SnortSnarf - SnortSnarf is a program that was designed for use with Sn...
  • social computing - Social computing is the collaborative and int...
  • social spreadsheet - A social spreadsheet is social software t...
  • software package - A software package is an assemblage of files ...
  • source code - Source code and object code refer to the "before" and "...
  • Squid proxy server - Squid is a Unix-based proxy server that c...
  • SSH Charter - The Secure Shell Charter (SSH Charter) is a set of pape...
  • Subversion - Subversion is a version control system that keeps track o...
  • Sugar - Sugar is a graphical user interface GUI developed for the $100 lapt...
  • SuSE - SuSE (pronounced soo'-sah) is a German Linux distribution provider an...

Security Think Tank

by System Administrator - Monday, 16 February 2015, 12:17 AM

Security Think Tank: Control smart devices and apps like the rest of ICT

by Mike Gillespie

No device, system or protocol will ever be 100% secure 100% of the time. When we talk about managing and mitigating risk we assume that appetite and tolerance have been established and create our policies and procedures within those boundaries.

This applies to individuals too, though they probably do not realise they are doing it. Every time they get the warning message before installing an app they make a decision about whether the level of access to their device – and therefore their information – is appropriate or not. They are making a choice between the risk to their data and the convenience of the app they want to use. 

Some may not take this very seriously and merely click through, allowing apps whatever access they request. These people may also not apply a similarly critical eye to establishing their device’s security, making a hack of their personal data – say photos – much easier to achieve.

That is what we saw recently with the iCloud nude photo stories that seemed ubiquitous for a couple of weeks. It is for these reasons that we must control smart devices and the apps on them in the same way we have traditionally controlled the rest of our ICT infrastructure.

But what does this mean for business? What can IT security teams do to minimise some of the risk from insecure apps and insecure cloud backup?

There are a number of ways in which IT security teams can attempt to prevent users from unwittingly uploading sensitive corporate data to an insecure cloud backup service:

  1. Ensure users receive regular security education, highlighting the issues related to storing data in the cloud. If the training is at a more personal level, they are more likely to remember it (for instance, use an example of the user storing personal data in the cloud and the impact of any leak or breach);
  2. Drive forward a culture of individual accountability. Underpin this culture with a robust set of security policies that reinforces the message to users that any accidental or intentional release of information is their responsibility, and may result in some form of disciplinary procedure;
  3. Wherever possible, minimise the risk of human error by restricting the ability to move data into a cloud environment only to appropriate users. This should be driven by their actual need to have this ability;
  4. Introduce some form of "splash screen" notification that reminds users they may be about to upload sensitive information to an insecure cloud backup service.  In other words, a polite reminder that what they are about to do could go against company policy;
  5. Implement an in-depth protective monitoring policy, which would include a "word scan" program to block any email/document transfer out of the network if one of a selection of words is detected;
  6. Secure the services of a reputable and trusted secure cloud service provider. This provider should be one that will welcome careful stipulations in your service level agreement, such as employee vetting, the right to audit, review of data sanitisation practices etc;
  7. An extreme measure would be to prevent all users from directly uploading to a cloud backup service, instead channelling everything through an IT department or IT provider, which would scrutinise each upload first and release it to the cloud only if it met a set of pre-defined authorisation conditions;
  8. Review devices following any operating system or application changes, since these can affect or neutralise security settings;
  9. If you are concerned about the possible compromise of sensitive data, don't back up to a cloud service provider;
  10. Ensure policies, measures and procedures are understood by, and apply to, all staff, including senior management and the board. These are the people most frequently overlooked, or whose behaviour has a blind eye turned to it – but they are just as fallible as anyone else.
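Measure 5's "word scan" idea can be sketched in a few lines (the watch-list terms and the simple word match are illustrative assumptions, not a recommended production configuration):

```python
# Minimal sketch of a "word scan" outbound filter (measure 5):
# block an email/document transfer if it contains a watch-list term.
WATCH_LIST = {"confidential", "payroll", "merger"}  # illustrative terms only

def should_block(text: str) -> bool:
    """Return True if any watch-list word appears in the outbound text."""
    words = set(text.lower().split())
    return bool(words & WATCH_LIST)

print(should_block("Minutes of the all-hands meeting"))   # → False
print(should_block("Draft merger terms - confidential"))  # → True
```

A real protective-monitoring deployment would sit on the mail gateway or proxy and handle attachments, encodings and phrase matching; this sketch only shows the core decision.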

A consistent approach is needed across an organisation and this will make your use of technology far more successful. The importance of contextualised education and refresher training is not to be underestimated.

In this, as in all areas of security, be iterative, be vigilant – and be prepared to react to emerging threats.

Mike Gillespie is director of cyber research and security at The Security Institute



Serious Games | Juegos Serios

by System Administrator - Wednesday, 24 May 2017, 12:41 PM


Serious Games | Juegos serios

From Wikipedia, the free encyclopedia

Serious games (from the English "serious game"), also called "formative games", are games designed for a primary purpose other than pure entertainment. The adjective "serious" normally refers to products used by industries such as defense, education, scientific exploration, healthcare, emergency services, civic planning, engineering, religion and politics.


Overview

The term "serious game" existed long before computer and electronic entertainment devices arrived. Clark Abt had already defined the term in 1970 in his book Serious Games, published by Viking Press.[1] In that book, Abt deals mainly with board games and card games, but he provides a general definition that applies easily to the games of the computer era:

Reduced to its formal essence, a game is an activity among two or more decision-makers seeking to achieve their objectives in some limiting context. A more conventional definition is one in which a game is a context with rules among adversaries trying to achieve objectives. We are concerned with serious games because they have an explicit and carefully thought-out educational purpose and are not intended to be played primarily for amusement.

In 2005, Mike Zyda took an updated and systematic approach to the term in an article published in the IEEE Computer Society's magazine Computer, titled "From Visual Simulation to Virtual Reality to Games".[2] Zyda first defines "game" and builds from there:

  • Game: a physical or mental contest, played according to specific rules, whose goal is to amuse or reward the participant.
  • Video game: a mental contest, played against a computer in accordance with certain rules, for amusement, recreation, or winning a stake.
  • Serious game: a mental contest, played in accordance with specific rules, that uses entertainment to further government or corporate training, with objectives in education, health, public policy and strategic communication.

Long before the term "serious game" came into use through the Serious Games Initiative in 2002, games were already being created for purposes other than entertainment. The continued commercial failure of educational entertainment titles, together with games' growing technical capacity to deliver realistic scenarios, led to a re-examination of the concept of serious games in the late 1990s. During this period a number of scholars began to examine the usefulness of games for other purposes, contributing to the growing interest in applying them to new ends. In addition, the capacity of games to contribute to training expanded with the development of multiplayer games. In 2002, the Woodrow Wilson International Center for Scholars in Washington D.C. created the Serious Games Initiative to encourage the development of games on policy and management issues. More specialized groups appeared in 2004, such as Games for Change, focused on social issues and social change, and Games for Health, devoted to applications related to healthcare.

There is no single definition of the term "serious game", although it is understood to refer to games used in areas such as training, advertising, simulation and education. Alternative definitions include concepts from games and gaming technology, as well as notions drawn from non-entertainment applications. Serious games are also beginning to include dedicated video-game hardware, as in video games for improving health and fitness.

Video games are a tool worth considering for cognitive-affective stimulation: they foster learning and self-esteem and strengthen creativity and digital skills, while also generating motivation and entertainment. Video games offer a mode of teaching that the educational community should take advantage of, given the many emotional elements they integrate, their sensory stimulation and the possibility of immersion through the virtual environments in which they unfold.[3]

Serious games are aimed at a wide range of audiences, from primary and secondary school students to professionals and consumers. Serious games can be of any genre, use any game technology and be developed for any platform. Some consider them a form of educational entertainment, although most of the community resists using that term.

A serious game may be a simulation with the look and feel of a game, but it deals with events or processes that have nothing to do with games, such as military or business operations (even though many popular entertainment games are based on military or business operations). Games are built to provide an entertaining, self-reinforcing context with which to motivate, educate and train players. Other goals of these games are marketing and advertising. The biggest users of serious games (a claim not backed by business intelligence) appear to be the United States government and medical professionals.[citation needed] Other commercial sectors are also actively pursuing the development of this type of tool.



The idea of using games in education dates back to the days before computers, but the first serious game is generally considered to be Army Battlezone, a failed project led by Atari in 1980, designed to use the Battlezone arcade video game for military training. In recent years, the U.S. government and military have periodically sought out video-game developers to create low-cost simulations that are both accurate and entertaining. Video-game developers' experience with game mechanics and game design makes them ideal candidates for developing these kinds of simulations, which cost millions of dollars less than traditional simulations, which often require special hardware or complete dedicated facilities.

Outside government, there is considerable interest in games for education, professional training, healthcare, advertising and public policy. For example, games from certain websites are, in the words of Henry Jenkins, director of MIT's comparative media studies program, "very political games created outside the corporate system" that are "raising issues through the media while using the unique properties of games to draw people in from a new perspective". These games, Jenkins has said, constitute "radical works of fiction". Michigan State University offers a master's degree and a graduate certificate in serious game design.[4] In Europe, the University of Salford created a master's program in creative games in 2005.[5]

Video-game developers are used to developing games quickly and are skilled at creating games that simulate functional entities such as radar and combat vehicles to varying degrees. Using existing infrastructure, game developers can create games that simulate battles, procedures and events at a fraction of the cost of traditional government contractors.

Developing and deploying traditional simulators normally costs millions of dollars, and these simulators generally require specialized hardware. The average cost of serious games is very low. Instead of the large volumes of media or computers that high-end simulators need, serious games require no more than a DVD or CD-ROM, exactly like traditional video games. Distribution amounts to mailing them out or providing access through a dedicated website.

Finally, while serious games are intended to train or educate users, they are also meant to entertain. Video-game developers are experienced in making games fun and engaging, since their livelihood depends on it. As they simulate events and procedures, developers automatically inject doses of entertainment and playability into their applications.


Classification and subgroups of serious games

Although the classification of serious games has yet to be consolidated, there is nevertheless a set of terms whose reasonably common usage warrants their inclusion here.

  • Advergaming: from advertising and game, the practice of using video games to promote a brand, product, organization, or idea.
  • Edutainment: a blend of education and entertainment, applied to programs that teach through the use of playful resources.
  • Game-based learning (educational games): games whose objective is to improve learning. They are generally designed to strike a balance between the subject matter on the one hand and, on the other, the gameplay and the player's ability to retain and apply that subject matter in the real world.[6] This type of game is used in the business world to improve employees' skills in areas such as customer service and negotiation.
  • Edumarket games: when a serious game combines several aspects (for example, those of advergaming and edutainment, or others related to journalism and persuasion), the application is said to be an edumarket game, a term formed from education and marketing. An example is Food Force, a game with objectives in the areas of news, persuasion, and edutainment.
  • News games: journalistic games (from news) that report on recent events or deliver editorial commentary.
  • Simulators or simulation video games: games used to acquire or practice skills, or to teach effective behavior in the context of simulated situations or conditions. Widely used examples include vehicle-driving simulators (cars, trains, planes, etc., such as FlightGear), company-management simulators (for example, Transport Tycoon), and general business simulators, which help develop strategic thinking and teach users the principles of micro- and macroeconomics and business administration (for example, Virtonomics).
  • Persuasive games: games used as persuasive technology.
  • Organizational-dynamic games: games that teach and reflect the dynamics of organizations at three levels: individual, group, and cultural.
  • Games for health: games designed for psychological therapy, cognitive training, or physical rehabilitation.
  • Art games: games used to express artistic ideas, or art created using video games as a medium.
  • Militainment: a blend of military and entertainment; games financed by the army or that otherwise reproduce military operations with a high degree of accuracy.

Julian Alvarez and Olivier Rampnoux (of the European Center for Children's Products at the University of Poitiers) have tried to classify serious games into five main categories: advergaming, edutainment, edumarket games, protest games (which the authors call diverted games), and simulation games.[7]

Examples of serious games

Hotzone: a networked multiplayer simulator that uses video game technology to train emergency teams, firefighters, and civil protection personnel to respond to dangerous situations. The main objectives of the simulation are communication, observation, and critical decision-making.

Food Force: an educational game produced under the supervision of the United Nations World Food Programme, in which the objective is to end a famine caused by an armed conflict in a given region. Between missions, videos are shown that explain the situation in the affected countries and how the UN addresses it.

Re-Mission: a completely free video game created for HopeLab, an organization that helps cancer patients. It lets players fight the disease by showing a nanorobot, "Roxxie," eradicating cancer cells through chemotherapy. The game provides information on different types of cancer and also helps players vent anger and feelings of rejection toward the disease. It is an action game with high pedagogical value that raises awareness about cancer.

Merchants: a video game created to train negotiation and conflict-management skills. Players, immersed in 15th-century Venice, face various situations through which they put into practice the theoretical content taught during the game. Merchants (also known as Navieros) is a product of Gamelearn.

Triskelion: a video game created for training in time management and personal productivity. Players take on the role of Robert Wise, a character through whom they must follow the clues that reveal the secret of the Order of Wisdom. Triskelion is a product of Gamelearn.

GABALL (Game Based Language Learning): a game for SME managers intended to foster the internationalization of their companies. The GABALL project aims to improve the competencies and skills of managers of SMEs and micro-enterprises for launching internationalization processes in internal and external markets (Brazil) through e-commerce platforms. GABALL also targets final-year higher-education students who may become entrepreneurs and/or are promoting entrepreneurial projects. Its ultimate goal is to improve their cultural and foreign-language competencies, thereby optimizing the use of e-marketing and e-commerce tools, building relationships through electronic media supported by social networks, and fostering entrepreneurship.[8]

Save the PKU Planet: a video game that teaches children with phenylketonuria how to manage their condition correctly, mainly through control of low-protein foods, discrimination between permitted and prohibited foods, and motivation to take the substitute dietary supplement, which is rich in amino acids and other nutrients but free of phenylalanine. Developed by Teaching Innovation Project no. 10-120 of the University of Granada in collaboration with Fundación Alicia and FX Animation. Available and adapted for children with phenylketonuria in Spain, Denmark, the United Kingdom, and the United States. Also available with Catalan subtitles.

DonostiON: a casual video game developed by Ikasplay centered on the main day of the Tamborrada of San Sebastián, celebrated every January 20 in the Gipuzkoan capital. The game promotes the drumming culture that takes over the city during those days.

Hackend: a free game from INCIBE for learning about cybersecurity in companies, in an entertaining way. You help a business owner named Max solve nine cases in which his small company's security is compromised. It is a graphic adventure in which Max, with your invaluable help, pursues the cybercriminals until he tracks them down.

Alternate reality serious games

Serious alternate reality games (Serious ARGs) are serious games set in the real world that use a wide range of audiovisual resources to develop a story that is constantly shaped by the intervention of its participants. Their objectives focus on encouraging self-improvement, harnessing game dynamics to intensify learning processes, and fostering collaborative environments and communication in order to solve the problems and puzzles posed.

World Without Oil is a serious alternate reality game (Serious ARG) in which players are responsible for managing the problem that our unbridled thirst for oil poses for our economy, climate, and quality of life, and for devising solutions by working collaboratively and creatively.


  1. Abt, C.: Serious Games, New York: Viking Press, 1970.
  2. Zyda, M.: "From visual simulation to virtual reality to games", in Computer, 38, 2005, pp. 25-32.
  3. Marcano Lárez, B. (2006): "Estimulación emocional de los videojuegos: efectos en el aprendizaje", in García Carrasco, J. (ed.), Estudio de los comportamientos emocionales en la red, electronic journal Teoría de la Educación.
  4. Master's program in serious game design at Michigan State University.
  5. Master's program in creative games at the University of Salford.
  6. Marc Prensky's book Digital Game-Based Learning was the first major publication to define this term: official website of Digital Game-Based Learning by Marc Prensky.
  7. Alvarez, J. and Rampnoux, O.: "Serious Game: Just a question of posture?", in Artificial & Ambient Intelligence (AISB '07), 2007, pp. 420-423.
  8. GABALL project.

External links



SLAs for the Cloud

by System Administrator - Thursday, 13 July 2017, 9:49 AM





by System Administrator - Wednesday, 10 July 2013, 7:43 PM
  • tcpdump - Tcpdump is an open source command-line tool for monitoring (sni...
  • The Open Group - The Open Group is a software standards organizati...
  • Tomcat - Tomcat is an application server from the Apache Software Foundati...
  • Torvalds, Linus - Linus Torvalds, the creator of the Linux kernel...
  • Tsunami UDP - Tsunami UDP is an open source file transfer protocol th...
  • Turtle Firewall - Turtle Firewall is an open source firewall prog...

Tips to create flexible but clear manuals

by System Administrator - Tuesday, 29 August 2017, 9:53 PM

Good application deployment manuals are thorough but usable. Follow these tips to create flexible but clear manuals that contribute to release management best practices.

Ways to make the application deployment process clear and flexible


Top 10 Considerations when Selecting a Secure Text Messaging Solution

by System Administrator - Monday, 13 July 2015, 5:25 PM

Top 10 Considerations when Selecting a Secure Text Messaging Solution

Evaluating Secure Text Messaging solutions can cause anyone’s eyes to glaze over in dreaded anticipation. But the process doesn’t have to be laborious, overwhelming, or fraught with perils when you know the right questions to ask.

Please read the attached whitepaper.


Top 10 ways to have your project end up in court

by System Administrator - Thursday, 13 August 2015, 2:03 PM

Top 10 ways to have your project end up in court

By David Taber

David Letterman may be off the air, but his Top 10 List format remains in the comedic canon. In his honor, here's David Taber-man's Top 10 list of the worst practices for agile projects.

As someone who’s sometimes called to be an expert witness, I’ve had to testify in arbitrations and court proceedings about the best practices in agile project management. Of course, when things devolve to the point of legal action, there haven’t been a lot of “best practices” in play by either side. Suffice it to say I’ve seen more than a few blunders by clients.

Here are the ones that show up over and over again:

10. Give the consultant ambiguous requirements, then start using rubber yardsticks*

Nothing’s more comforting to the client than the idea that they'll get exactly what they want, even if they never put sufficient energy into specifying exactly what that is. This goes double for the high-risk area of custom software development. So state your requirements vaguely, to make sure that anything you dream up later can be construed as being within the bounds of your original wording. This tactic works best when you don't really understand the technology, but you do know you need the deliverable to be infinitely detailed yet easy enough for your grandmother to use without training. This tactic is even more effective when you start having detailed conversations about requirements during acceptance testing, when there are no development funds left.

[Related: Top 10 project management certifications]

[*What’s a rubber yardstick? Instead of being like a regular yardstick that is a straight line of fixed length, the rubber yardstick stretches and shrinks and even bends and twists to connect dots that aren’t even on the same plane.]

9. Don't put decisions in writing or email

Writing things down merely ties you down, and that just limits your flexibility in the future (see #10). Much better to give verbal feedback in wide-ranging meetings that never really come to a conclusion. During these meetings, make sure that many attendees are texting or responding to fire-drills unrelated to your project, so they lose focus and have no recollection of what was said. When it comes to signing off requirements, monthly reviews or acceptance testing – just ignore this bean-counting detail!

8. Under-staff your team

You're paying good money for the consultant to do their job, so there's no point in over-investing in your own team members. Put in no-loads and people who don't care, so that the people who actually know what they’re doing can stick to their regular tasks. Once you have your drones in place, make sure to undercut their authority by questioning every decision. No point in motivating anybody – you're already paying them more than they deserve!

7. Blow off approval cycles, wireframe reviews and validation testing

You've got to focus on the big picture of making your business more profitable, so you don’t have time to get into the niggling details of this software project. Besides, business processes and policy decisions are boring and can be politically charged. So when some pesky business analyst asks you to validate the semantic interpretation of a business rule, just leave that mail in your inbox. It'll keep. Later, when it comes to testing and reviews, just send a flunkie with no decision-making authority to check things out.

6. Blatantly intimidate your team

Review meetings should be an expression of your personal power, prestige and authority. Change your mind endlessly and capriciously about details. Change the subject when substantive issues are brought up. Discuss how much your new shoes cost. Punish any questioner. Trust no one (not even your own team members), and make sure that trust never gets a chance to build within the team. Make sure team members know to hide bad news. Use blame as a weapon.

5. Focus on big-bang, slash cut projects with lots of dependencies

Crash programs are the way to get big things done in a hurry. Incrementalism is for wimps and weenies without the imagination to see the big picture. Since complex projects probably involve several vendors, make sure that nothing can progress without your direction and approval. Do not delegate – or if you do, don't empower the delegate to do anything. You wouldn't want to lose control!

4. Switch deadlines and priorities frequently

If the generals are right in saying that all your plans go out the window as soon as the first shot is fired, there's no point in planning realistically in the first place. Make sure to have deadlines for things with lots of dependencies, and then move those deadlines three or four times during the project. This’ll ensure that the project will involve inordinate waste and overhead – but hey, that’s the consultant’s problem, not yours.

[Related: Agile project management lessons learned from Texas Hold'em]

3. Have no contingency plan and no tolerance for budget shifts

It's pedal to the metal – nobody has time for insurance policies. You can't afford to run two systems in parallel, validate live transactions or reconcile variances before full production use. Make sure you dedicate 100 percent of your budgetary authority to the vendors, so there's no way to reallocate funds...let alone have a buffer to handle the unexpected. This works even better when your company enforces a use-it-or-lose-it financial regime.

2. Squeeze the vendor as early in the project as you can

Get your money’s worth. It's never too early to start squeezing your vendors to get the most out of them. Their negative profit margin is not your problem. Show 'em who's really boss. As the project nears its end-game, start modifying the production system yourself, and begin phase-2 work before phase-1 work has been signed off. Configuration control is for weenies.

And the #1 way to make sure your project ends up in court…

1. Don't monitor your own budget and pay little attention at status reviews

Ignore invoices and progress-against-goals reports. Make sure the integrator doesn't know you are not paying attention. Don’t ask questions at project review meetings. Delete emails that bore you. The vendor is there to deliver, so the details and consequences of project management are not your problem. As the project nears its deadline, insist on extra consultant personnel on site without giving any written authorization for extra charges.

Before I say anything more, I have to make it really clear that I’m not an attorney, and none of this is to be construed or used as legal advice. (Yes, my lawyer made me write that.) So get counsel from counsel about the best ways to remedy or prevent the issues above.

As I said at the start, projects that are deeply troubled have problems rooted in the behavior of both the client and the consultant. Next time, I’ll have a Top 10 list for consultants to make sure they end up in court, too.




by System Administrator - Wednesday, 10 July 2013, 7:44 PM

Unstructured Data

by System Administrator - Tuesday, 18 April 2017, 3:07 PM

Pulling Insights from Unstructured Data – Nine Key Steps

by Salil Godika

Data, data everywhere, but not a drop to use. Companies are increasingly confronted with floods of data, including "unstructured data": information from email messages, social posts, phone calls, and other sources that isn't easily fitted into traditional columns. Deriving sense and actionable recommendations from structured data is difficult, and doing so from unstructured data is even harder.

Despite the challenge, the benefits can be substantial. Companies that commit to examining unstructured data from devices and other sources should be able to find hidden correlations and surprising insights. Such analysis promotes trend discovery and opens opportunities in ways that traditionally structured data cannot.

Analyzing unstructured data can be best accomplished by following these nine steps:

1. Gather the data

Unstructured data means there are multiple unrelated sources. You need to find the information that needs to be analyzed and pull it together. Make sure the data is relevant so that you can ultimately build correlations.

2. Find a method

You need a method in place to analyze the data and have at least a broad idea of what should be the end result. Are you looking for a sales trend, a more traditional metric, or overall customer sentiment? Create a plan for finding a result and what will be done with the information going forward.

3. Get the right stack

The raw data you pull will likely come from many sources, but the results have to be put into a tech stack or cloud storage in order for them to be operationally useful. Consider the final requirements that you want to achieve and then judge the best stack. Some basic requirements are real-time access and high availability. If you’re running an ecommerce firm, then you want real-time capabilities and also want to be sure you can manage social media on the fly based on trend data.


4. Put the data in a lake

Organizations that want to keep information will typically scrub it and then store it in a data warehouse. This is a clean way to manage data, but in the age of Big Data it removes the chance to find surprising results. The newer technique is to let the data swim in a “data lake” in its native form. If a department wants to perform some analysis, they simply dip into the lake and pull the data. But the original content remains in the lake so future investigations can find correlations and new results.

5. Prep for storage

To make the data useful (while keeping the original in the lake), it is wise to clean it up. For example, text files can contain a lot of noise, symbols, or whitespace that should be removed. Duplicates and missing values should also be detected so analysis will be more efficient.
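As a rough illustration of this cleanup step, the sketch below strips stray symbols, collapses whitespace, and drops missing values and duplicates. It is a minimal example under stated assumptions: the record format and the exact cleaning rules are illustrative, not part of the original article.

```python
import re

def clean_record(text):
    """Normalize one raw text record: strip stray symbols, collapse whitespace."""
    text = re.sub(r"[^\w\s.,!?'-]", " ", text)   # drop symbols outside a small allowlist
    text = re.sub(r"\s+", " ", text)             # collapse runs of whitespace/tabs
    return text.strip()

def prep_for_storage(records):
    """Clean records, drop missing/empty values, and remove duplicates."""
    seen, cleaned = set(), []
    for raw in records:
        if raw is None:                          # missing value
            continue
        text = clean_record(raw)
        if not text or text in seen:             # empty after cleaning, or duplicate
            continue
        seen.add(text)
        cleaned.append(text)
    return cleaned

raw = ["  Great   service!!! ###", None, "Great service!!! ###", "slow\tshipping"]
print(prep_for_storage(raw))   # ['Great service!!!', 'slow shipping']
```

A real pipeline would apply the same idea at scale (and per data type), but the shape of the step — normalize, filter, deduplicate — is the same.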

6. Find the useful information amongst the clutter

Semantic analysis and natural language processing techniques can be used to pull out key phrases along with the relationships between them. For example, "location" can be extracted and categorized from speech in order to establish a caller's location.
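As a toy sketch of the idea: a production system would use an NLP library's named-entity recognition, but even a simple pattern can pull candidate "location" phrases from a call transcript. The pattern, category name, and sample line below are illustrative assumptions.

```python
import re

# Toy heuristic: treat capitalized word(s) after "in"/"from"/"near" as a location mention.
LOCATION_PATTERN = re.compile(r"\b(?:in|from|near)\s+((?:[A-Z][a-z]+\s?)+)")

def tag_locations(utterance):
    """Return (phrase, category) pairs pulled from one transcript line."""
    return [(m.strip(), "location") for m in LOCATION_PATTERN.findall(utterance)]

line = "Hi, I'm calling from Chicago about an order I placed in New York."
print(tag_locations(line))   # [('Chicago', 'location'), ('New York', 'location')]
```

The point is the output shape — phrase plus semantic category — which feeds the relationship-building step that follows; real extractors replace the regex, not the shape.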

7. Build relationships

This step takes time, but it's where the actionable insights lie. By establishing relationships between the various sources, you can build a more structured database that has more layers and complexity (in a good way) than a traditional single-source database.

8. Employ statistical modeling

Segmenting and classifying the data comes next. Use tools such as K-means, Naïve Bayes, and Support Vector Machine algorithms to do the heavy lifting of finding correlations. You can use sentiment analysis to gauge customers' moods over time and how they are influenced by product offerings, new customer service channels, and other business changes. Temporal modeling can be applied to social media and forums to find the most relevant topics being discussed by your customers. This is valuable information for social media managers who want the brand to stay relevant.
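As a self-contained sketch of the segmentation step: in practice you would use a library implementation of K-means (e.g. scikit-learn) on high-dimensional feature vectors, but a minimal one-dimensional version shows the mechanics. The sentiment scores and cluster count below are illustrative assumptions.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal 1-D K-means: returns (centroids, cluster label per point)."""
    random.seed(seed)
    centroids = random.sample(points, k)          # pick k initial centroids
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Move each centroid to the mean of its cluster.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    labels = [min(range(k), key=lambda i: abs(p - centroids[i])) for p in points]
    return centroids, labels

# Hypothetical sentiment scores (-1 = negative, +1 = positive) for customer posts.
scores = [-0.9, -0.8, -0.7, 0.1, 0.2, 0.8, 0.9, 0.85]
centroids, labels = kmeans(scores, k=2)
print(labels)
```

With well-separated scores like these, the algorithm converges to a split between the negative and positive groups, which is the kind of segmentation the article describes.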

9. End results matter

The end result of all this work has to be condensed down to a simplified presentation. Ideally, the information can be viewed on a tablet or phone and helps the recipient make smart real-time decisions. They won’t see the prior eight steps of work, but the payoff should be in the accuracy and depth of the data recommendations.

Every company’s management is pushing the importance of social media and customer service as the main drivers of company success. However, these services can provide another layer of assistance to firms after diagnostic tools are applied to their underlying data. IT staff need to develop certain skills in order to properly collect, store, and analyze unstructured data in order to compare it with structured data to see the company and its users in a whole new way.


About the author: Salil Godika is Co-Founder, Chief Strategy & Marketing Officer and Industry Group Head at Happiest Minds Technologies. Salil has 18 years of experience in the IT industry across global product and services companies. Prior to Happiest Minds, Salil was with MindTree for 4 years as the Chief Strategy Officer. Before MindTree, Salil spent 12 years in the United States working for start-ups and large technology product companies like Dassault Systèmes, EMC and i2 Technologies. His accomplishments include incubating a new product to $30 million in revenue, successful market positioning of multiple products, global marketing for a $300 million business and multiple M&As.





by System Administrator - Wednesday, 10 July 2013, 7:45 PM


by System Administrator - Wednesday, 10 July 2013, 7:45 PM


by System Administrator - Wednesday, 10 July 2013, 7:46 PM
  • Xen - Xen is an open source virtual machine monitor for x86-compatible comput...
  • Xubuntu - Ubuntu (pronounced oo-BOON-too) is an open source Debian-based ...


by System Administrator - Wednesday, 10 July 2013, 7:47 PM


by System Administrator - Wednesday, 10 July 2013, 7:48 PM
  • zsh - Zsh is a shell (command interpreter) of the Bourne shell family, which ...



Decoding DNA: New Twists and Turns (DNA)

by System Administrator - Wednesday, 26 June 2013, 10:19 PM

The Scientist takes a bold look at what the future holds for DNA research, bringing together senior investigators and key leaders in the field of genetics and genomics in this 3-part webinar series.

The structure of DNA was solved on February 28, 1953 by James D. Watson and Francis H. Crick, who recognized at once the potential of DNA's double helical structure for storing genetic information — the blueprint of life. For 60 years, this exciting discovery has inspired scientists to decipher the molecule's manifold secrets and resulted in a steady stream of innovative advances in genetics and genomics.


What's Next in Next-Generation Sequencing?

Original Broadcast Date: Tuesday March 5, 2013

The advent of next-generation sequencing is considered the field's most powerful driver of progress, doubling the amount of sequence data almost every 5 months and precipitously dropping the cost of sequencing a piece of DNA. The first webinar will track the evolution of next-generation sequencing and explore what the future holds in terms of the technology and its applications.


George Church is a professor of genetics at Harvard Medical School and Director of the Personal Genome Project, which provides the world's only open-access information on human genomic, environmental and trait data (GET). His 1984 Harvard PhD included the first methods for direct genome sequencing, molecular multiplexing, and barcoding. These led to the first commercial genome sequence (the pathogen Helicobacter pylori) in 1994. His innovations in "next generation" genome sequencing and synthesis and cell/tissue engineering resulted in 12 companies spanning fields including medical genomics (Knome, Alacris, AbVitro, GoodStart, Pathogenica) and synthetic biology (LS9, Joule, Gen9, WarpDrive), as well as new privacy, biosafety, and biosecurity policies. He is director of the NIH Centers of Excellence in Genomic Science. His honors include election to the NAS and NAE and being named Franklin Bower Laureate for Achievement in Science.

George Weinstock is currently a professor of genetics and of molecular microbiology at Washington University in Saint Louis. He was previously codirector of the Human Genome Sequencing Center at Baylor College of Medicine in Houston, Texas where he was also a professor of molecular and human genetics. Dr. Weinstock received his BS degree from the University of Michigan (Biophysics, 1970) and his PhD from the Massachusetts Institute of Technology (Microbiology, 1977).

Joel Dudley is an assistant professor of genetics and genomic sciences and Director of Biomedical Informatics at Mount Sinai School of Medicine in New York City. His current research is focused on solving key problems in genomic and systems medicine through the development and application of translational and biomedical informatics methodologies. Dudley's published research covers topics in bioinformatics, genomic medicine, personal and clinical genomics, as well as drug and biomarker discovery. His recent work with coauthors describing a novel systems-based approach for computational drug repositioning, was featured in the Wall Street Journal, and earned designation as the NHGRI Director's Genome Advance of the Month. He is also coauthor (with Konrad Karczewski) of the forthcoming book, Exploring Personal Genomics. Dudley received a BS in microbiology from Arizona State University and an MS and PhD in biomedical informatics from Stanford University School of Medicine.

Unraveling the Secrets of the Epigenome

Original Broadcast Date: Thursday April 18, 2013

This second webinar in The Scientist's Decoding DNA series will cover the Secrets of the Epigenome, discussing what is currently known about DNA methylation, histone modifications, and chromatin remodeling and how this knowledge can translate to useful therapies.


Stephen Baylin is a professor of medicine and of oncology at the Johns Hopkins University School of Medicine, where he is also Chief of the Cancer Biology Division of the Oncology Center and Associate Director for Research of The Sidney Kimmel Comprehensive Cancer Center. Together with Peter Jones of the University of Southern California, Baylin also leads the Epigenetic Therapy Stand Up To Cancer Team (SU2C). He and his colleagues have fostered the concept that DNA hypermethylation of gene promoters, with its associated transcriptional silencing, can serve as an alternative to mutations for producing loss of tumor-suppressor gene function. Baylin earned both his BS and MD degrees from Duke University, where he completed his internship and first-year residency in internal medicine. He then spent 2 years at the National Heart and Lung Institute of the National Institutes of Health. In 1971, he joined the departments of oncology and medicine at the Johns Hopkins University School of Medicine, an affiliation that still continues.

Victoria Richon heads the Drug Discovery and Preclinical Development Global Oncology Division at Sanofi. Richon joined Sanofi in November 2012 from Epizyme, where she served as vice president of biological sciences beginning in 2008. At Epizyme she was responsible for the strategy and execution of drug discovery and development efforts that ranged from target identification through candidate selection and clinical development, including biomarker strategy and execution. Richon received her BA in chemistry from the University of Vermont and her PhD in biochemistry from the University of Nebraska. She completed her postdoctoral research at Memorial Sloan-Kettering Cancer Center.

Paolo Sassone-Corsi is Donald Bren Professor of Biological Chemistry and Director of the Center for Epigenetics and Metabolism at the University of California, Irvine, School of Medicine. Sassone-Corsi is a molecular and cell biologist who has pioneered the links between cell-signaling pathways and the control of gene expression. His research on transcriptional regulation has elucidated a remarkable variety of molecular mechanisms relevant to the fields of endocrinology, neuroscience, metabolism, and cancer. He received his PhD from the University of Naples and completed his postdoctoral research at CNRS, in Strasbourg, France.

The Impact of Personalized Medicine

Original Broadcast Date: Tuesday May 7, 2013

After the human genome was sequenced, Personalized Medicine became an end goal, driving both academia and the pharma/biotech industry to find and target cellular pathways and drug therapies that are unique to an individual patient. The final webinar in the series will help us better understand The Impact of Personalized Medicine, what we can expect to gain and where we stand to lose.


Jay M. ("Marty") Tenenbaum is founder and chairman of Cancer Commons. Tenenbaum’s background brings a unique perspective of a world-renowned Internet commerce pioneer and visionary. He was founder and CEO of Enterprise Integration Technologies, the first company to conduct a commercial Internet transaction. Tenenbaum joined Commerce One in January 1999, when it acquired Veo Systems. As chief scientist, he was instrumental in shaping the company's business and technology strategies for the Global Trading Web. Tenenbaum holds BS and MS degrees in electrical engineering from MIT, and a PhD from Stanford University.

Amy P. Abernethy, a palliative care physician and hematologist/oncologist, directs both the Center for Learning Health Care (CLHC) in the Duke Clinical Research Institute, and the Duke Cancer Care Research Program (DCCRP) in the Duke Cancer Institute. An internationally recognized expert in health-services research, cancer informatics, and delivery of patient-centered cancer care, she directs a prolific research program (CLHC/DCCRP) which conducts patient-centered clinical trials, analyses, and policy studies. Abernethy received her MD from Duke University School of Medicine.

Geoffrey S. Ginsburg is the Director of Genomic Medicine at the Duke Institute for Genome Sciences & Policy. He is also the Executive Director of the Center for Personalized Medicine at Duke Medicine and a professor of medicine and pathology at Duke University Medical Center. His work spans oncology, infectious diseases, cardiovascular disease, and metabolic disorders. His research addresses the challenges of translating genomic information into medical practice using new and innovative paradigms, and the integration of personalized medicine into health care. Ginsburg received his MD and PhD in biophysics from Boston University and completed an internal medicine residency at Beth Israel Hospital in Boston, Massachusetts.

Abhijit “Ron” Mazumder obtained his BA from Johns Hopkins University, his PhD from the University of Maryland, and his MBA from Lehigh University. He worked for Gen-Probe, Axys Pharmaceuticals, and Motorola, developing genomics technologies. Mazumder joined Johnson & Johnson in 2003, where he led feasibility research for molecular diagnostics programs and managed technology and biomarker partnerships. In 2008, he joined Merck as a senior director and Biomarker Leader. Mazumder rejoined Johnson & Johnson in 2010 and is accountable for all aspects of the development of companion diagnostics needed to support the therapeutic pipeline, including selection of platforms and partners, oversight of diagnostic development, support of regulatory submissions, and design of clinical trials for validation of predictive biomarkers.

