Glosario KW | KW Glossary


Ontology Design | Diseño de Ontologías


D


D (Storage)

by System Administrator - Friday, 31 May 2013, 10:37 PM
 

D (WEB SERVICES)

by System Administrator - Saturday, 1 June 2013, 3:11 PM
 
  • DIME (Direct Internet Message Encapsulation) - a specification that defines a format for attaching files to Simple Object Access Protocol (SOAP) messages between application programs over the Internet. DIME is similar to but somewhat simpler than the Internet's MIME protocol.
  • Document Object Model (DOM) - a programming interface from the W3C that lets a programmer create and modify HTML pages and XML documents as full-fledged program objects (see the sketch after this list).
  • DSML (Directory Services Markup Language) - an XML application that enables different computer network directory formats to be expressed in a common format and shared by different directory systems.
  • DXL (Domino Extensible Language) - a specific version of Extensible Markup Language (XML) for Lotus Domino data.
  • DirXML - Novell's directory interchange software that uses XML to keep different directories synchronized.
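
To make the Document Object Model entry above concrete, here is a minimal sketch using Python's built-in xml.dom.minidom module; the XML snippet, element names and attribute values are invented for illustration only.

```python
# A minimal sketch of working with the DOM via Python's xml.dom.minidom.
from xml.dom.minidom import parseString

doc = parseString("<catalog><item id='1'>Widget</item></catalog>")

# Read an existing element as a program object
item = doc.getElementsByTagName("item")[0]
print(item.getAttribute("id"), item.firstChild.data)   # -> 1 Widget

# Modify the document through the same interface
new_item = doc.createElement("item")
new_item.setAttribute("id", "2")
new_item.appendChild(doc.createTextNode("Gadget"))
doc.documentElement.appendChild(new_item)

print(doc.toxml())
```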

D - Vocabulario Popular Montevideano | Popular Vocabulary of Montevideo

by System Administrator - Tuesday, 12 January 2016, 2:11 PM
 

MONTEVIDEANS IN A FEW WORDS
(AN APPROACH TO OUR POPULAR LANGUAGE)

- D -

  • DADO/A- A very warm, friendly person with whom it is easy to strike up a cordial relationship.
  • DANDY- A guy who takes great care of his appearance, always trying to dress elegantly, and who often turns out to be merely a superficial person looking to stand out from everyone else.
  • DAR ABASTO- To be able to keep up with something without trouble. To have enough time and strength.
  • DAR BOLA- To pay attention/To give something more importance than it deserves.
  • DAR BOLILLA- Same as "dar bola".
  • DAR BOMBA- To fornicate.
  • DAR CHANGÜÍ- See: Changüí.
  • DAR COMO ADENTRO DE UN GORRO- See: Dar como en bolsa.
  • DAR COMO EN BOLSA- To give someone a ferocious beating/To punish.
  • DAR DE LA QUE PICA- Same as "dar bola"/"dar pelota".
  • DAR EL BATACAZO- See: Batacazo.
  • DAR EL BRAZO A TORCER- To concede a mistake/To admit an error.
  • DAR EL CAMPANAZO- See: Batacazo.
  • DAR EL DULCE- To tempt someone with certain tricks in order to get something one is after from them.
  • DAR EL PESTO- See: Pesto.
  • DAR EN EL CLAVO- To hit the mark/To find the key to something.
  • DAR GUASCA- To punish/To overuse/To fornicate.
  • DAR LA BIABA- To give a beating/"Darse la biaba": to groom oneself excessively.
  • DAR LAS DOCE ANTES DE HORA- Usually said of a very attractive woman or man/Something that is more than ready.
  • DARLE CON UN CAÑO- With everything/Without mercy.
  • DARLE DE PUNTA- Same as "darle con un caño".
  • DARLE PA’ TABACO- To scold or punish severely/Also said of sex: "Fulanita le dio pa’ tabaco a fulanito...".
  • DARLE PA’ TABACO, HOJILLA Y FÓSFOROS- The same as the previous definition, only more so.
  • DARLO POR HECHO- To trust fully in something that has been promised.
  • DARLO POR PERDIDO- To give up all hope regarding something.
  • DAR MANIJA- To egg on/To encourage someone to go through with something.
  • DAR PELOTA- Same as "dar de la que pica".
  • DARSENERO- A supporter of Club Atlético River Plate.
  • DAR SIN LÁSTIMA- To beat someone ferociously/To do something without any regard for others.
  • DAR UNA MANO- To lend a hand.
  • DAR UNA MOVIDA- To deal out a punishment.
  • DÁTILES- Fingers.
  • DAVI- "Vida" (life), with the syllables reversed.
  • DE A DOS POR EL PASILLO- The bus conductor's instruction for standing passengers to form a double line in the aisle, making room for more people to board.
  • DE A PUCHITO- Little by little/Slowly/In small installments.
  • DE APURO- Quickly and unexpectedly/Without prior notice.
  • DE ARCO A ARCO- A kids' two-player ball game, kicking the ball back and forth from one goal to the other.
  • DE ARACA- To be left without achieving what one was after. "Lo dejó de araca y se mandó a mudar..."
  • DE ARRIBA- To get something at no cost to oneself. Same as "de garrón".
  • DE ARRIBOIDE- Same as "de arriba".
  • DE BIÓGRAFO- Hard to believe.
  • DEBUTAR- To make one's debut in something/Generally used of the first sexual experience.
  • DEBUTE- Fine, very fine. An unbeatable situation.
  • DECA- Decadence.
  • DE CABEZA- To head the ball in football/"Tener a alguien de cabeza": to have someone hooked, under one's spell.
  • DECADENTE- Decrepit/Finished/Bankrupt.
  • DE CAJÓN- Of course!/Obviously.
  • DE COLOR- A euphemism for black people.
  • DE COTELETE- Sideways/Accidentally/By chance.
  • DEDAZO- Appointing someone "by pointing a finger", without regard for anyone else's opinion or even for whether the person has the merits for the post.
  • DE DORAPA- See: De parado.
  • DEGOLLAR- To cut someone's throat/To swindle/To deceive/To take advantage.
  • DE GOLPE Y PORRAZO- Out of the blue/When nobody expected it.
  • DEJADEZ- Neglect-Decline.
  • DEJADO- Neglected-Run-down.
  • DEJARLA AHÍ NOMÁS- To consider something sufficiently discussed. Full stop.
  • DEJARLO SECO- To con someone out of their last peso/To land a blow hard enough to put the opponent out of action.
  • DEJAR COLGADO- To leave someone hanging, without much explanation.
  • DEJAR CORRER LA BOLA- Not to bother stopping the rumors.
  • DEJAR LA VIDA- To make the ultimate sacrifice.
  • ¡DEJALO PASTAR QUE ENGORDE...!- To let someone indulge in what they are doing, only to settle accounts with them later.
  • ¡DEJATE DE EMBROMAR!- Stop bothering me.
  • DEJATE DE JODER- The same as the previous term, but said more crudely and more aggressively.
  • DEJÓ UN PUEBLO ADENTRO- Said of someone who swindled many people, or who left debts with one person and another that can never be collected.
  • DEL TIEMPO DE MARÍA CASTAÑA- See: Del tiempo 'el ñaupa.
  • DEL TIEMPO ‘EL ÑAUPA- Something very old.
  • DEL OTRO CUADRO- Effeminate.
  • DEL OTRO LADO- "Pasarlo para el otro lado" is said when a homicide takes place/Also said of a person who escaped poverty through a stroke of fortune.
  • DE MALA BEBIDA- An individual whom drinking alcohol turns into a belligerent person.
  • DE MALA GANA- To do something without enthusiasm/Grudgingly.
  • DE MALA LECHE- Ill-intentioned-Untrustworthy.
  • DE MENTIRA- In jest/To fake a situation to fool someone or simply for fun.
  • DE MI FLOR- Something excellent, very good/What truco players say when they call "flor" while playing a card: "A mi juego me llamaron / y aunque yo no soy doctor / pa’ curarlos he traído / un remedio de mi flor".
  • DENTRE- Feeling out one's chances in a game/Looking for a shot at winning/The sum of money that must be paid to sign up for something.
  • DE OJITO- To bet from the sidelines, without paying the stake/A romance expressed only through the lovers' glances, with neither of the two daring to make the first move.
  • DE PARADO- Standing up, generally in a hurry, referring to sex.
  • DE PEDO- By chance/By luck.
  • DE PELÍCULA- Terrific/Incredible.
  • DEPTO- (An adopted Buenos Aires term.) Same as "bulín".
  • DE PREPO- High-handedly, by force.
  • DE PURO CULO- By sheer luck.
  • DE RAMBUYÉ- A term popularized by football commentator Alberto Kesmann to describe a special way of kicking the ball. Although nobody knows exactly what that way is, people commonly say something is done "de rambuyé" when it is done well and accurately.
  • DERECHO- An honest, upright, reliable individual/A person who keeps their word and neither accepts nor plays tricks.
  • DERECHO VIEJO- Straight out/Without subterfuge or beating around the bush/Head-on.
  • DE RECHUPETE- Just as one would wish/Just right.
  • DERRETIDO/A- An undisguised crush on someone of the opposite sex/Lovey-dovey/Cloying.
  • DERRETIRSE- To turn lovey-dovey/To get a crush.
  • DESATARSE- To free oneself of prejudices and inhibitions/To let loose.
  • DESAYUNAR- To wise someone up/To tell someone something they did not know.
  • DESAYUNARSE- To find out, belatedly, about something very important.
  • DESBANCAR- To steal another man's woman/To supplant someone.
  • DESCANGAYADO/A- Broken-Battered-In very bad shape/A person of scant judgment.
  • DESCARRILARSE- To stray from the right path/To do things that are out of line.
  • DESCAROZAR- See: Desvirgar.
  • DESCHAVADO- Very well known/An individual whose bad habits everyone knows.
  • DESCHAVARSE- To give oneself away/To show one's true colors/To stop pretending and reveal one's true personality or real intentions.
  • DESCHAVE- A giving-away, an informing/Showing oneself as one really is, without inhibitions or concealment.
  • DESCHAVETADO- Said of someone who has lost control of themselves. Crazed. Beside oneself.
  • DESCONCHE- A mess/An uproar/A big blowout.
  • DESCUAJARINGADO-DESCUAJERINGADO- Broken/Out of order/Falling apart.
  • DESEMBUCHAR- To spill the beans-To say everything one had not wanted to reveal until that moment.
  • DESGRACIARSE- To kill someone in a brawl without meaning to/To fart.
  • DESOREJADO/A- Shameless.
  • DESPACHADO/A- Finished off.
  • DESPACHAR- To finish off.
  • DESPACHARSE- To get something out of one's system/To do or say things one had long wanted to do or confess.
  • DESPATARRADO/A- Fallen abruptly/Sprawled out crudely/Abandoned.
  • DESPATARRARSE- To fall sprawling/To let oneself go.
  • DESPATARRO- Disorder/A mess.
  • DESPELOTADO- Wild, unrestrained-Naked.
  • DESPELOTE- Mayhem-Uproar/A huge mess-A tangle.
  • DESPIPORRE- Racket/Mess/Uproar.
  • DESTORNILLARSE- To lose one's head/To lose one's mind/"Destornillarse de risa": to go crazy laughing.
  • DESVIRGAR- To initiate someone sexually.
  • DE UN SAQUE- In one go/Quickly/On the first try.
  • DE UPA- Same as "de arriba".
  • DIEZ Y DIEZ- A name for people whose feet point outward "Chaplin style", so that they always seem to be showing ten past ten on a clock face.
  • ¡DIFÍCIL PARA ESCORPIO!- Something with little chance of coming off.
  • DIFICILONGO- Difficult.
  • ¡DIGA, CHE...!- A rather inelegant way of getting someone's attention.
  • DIMES Y DIRETES- Gossip.
  • DIOME- "Medio" (half), with the syllables reversed.
  • ¡DIOS MÍO!- An expression used in the face of a worrying situation.
  • ¡DIOS ME LIBRE Y GUARDE!- It was common, especially among the old folks, to say this and cross oneself at the same time upon seeing another person who was ill, disabled or had some deformity.
  • DIQUE- Showing off/Putting on airs.
  • DIQUERO/A- A show-off-A person who tries to appear to be what they are not.
  • DOBLAR EL CODO- To reach the age at which one starts to be a veteran/To near the finish line.
  • DOBLETE- Double.
  • DOGO- Dogomar Martínez, one of the greatest figures in the history of Uruguayan boxing.
  • DOGOR- "Gordo" (fat man), with the syllables reversed.
  • DOLCHE FAR NIENTE- To live doing nothing, enjoying the good life.
  • DOLOROSA- The bill to be paid/"A ver, mozo... tráigame la dolorosa".
  • DOMINGO SIETE- An unwelcome outburst/"¡Ya tuvo éste que salirme con un domingo siete!".
  • DON- Sir, Mister.
  • ¿DÓNDE ESTÁS MATEO QUE NO TE VEO...?- A humorous complaint from someone in a mate circle whose turn has not come around for quite a while.
  • DON JUAN- A man forever bent on romantic conquests.
  • DON NADIE- A nobody.
  • DON PAULINO- A kind of lay patron saint of retirees. Don Paulino González was a great fighter for the rights of the elderly and created a foundation that is still active today.
  • DON PEPE- A respectful nickname used to remember both don Pepe Batlle (José Batlle y Ordóñez, former president of the Republic) and don Pepe Artigas (don José Gervasio Artigas, the greatest hero of our revolutionary struggle for freedom and dignity).
  • DOÑA- Madam, Mrs.
  • DORAR LA PÍLDORA- To flatter someone cunningly in order to dupe them later.
  • DORIMA- "Marido" (husband), with the syllables reversed.
  • DORMIDA- A dim-witted woman/The special overnight rate at a by-the-hour hotel: "la dormida" works out cheaper than paying by the hour.
  • DORMIDO- A lazy, listless individual/Slow on the uptake.
  • DOS COBRES- "No vale dos cobres" refers to a person or thing of very little value or importance.
  • DOS DE ORO- Big eyes.
  • DOS POR TRES- Every now and then-Humorously people say: "¡Dos por tres... llueve...!"
  • DRAGÓN/A- A suitor who has not yet sealed the conquest.
  • DRAGONEAR- To show interest in someone of the opposite sex with gestures, glances and other hints/To covet something, eyeing it with great interest.
  • DULCE- Said of someone very tender and affectionate: "Fulanito es un dulce..."/For another sense, see: Dar el dulce.
  • DURO COMO UNA PIEDRA (also "duro como piedra")- Refers to an individual impossible to soften or incapable of showing affection/Also refers to someone who faces the most difficult situations without fear or hesitation.

Link: http://www.mec.gub.uy/academiadeletras/Boletines/02/martinez4.htm


DaaS (CLOUD)

by System Administrator - Wednesday, 5 June 2013, 6:49 PM
 

Good device management is essential for any organization. Trends such as Bring Your Own Device (BYOD) and the consumerization of IT are also giving these services momentum. In this context, DaaS solutions make it possible to manage PC services securely and at reduced cost. It is worth remembering that this model offers portability, so it can be administered from any location and on any device.

In short, DaaS can cut operational costs, since hardware administration is reduced and management is decentralized. In addition, the company becomes more mobile and faster at managing devices, and therefore more productive.


Dan Bricklin: Spreadsheet Inventor on a Life in Computing

by System Administrator - Friday, 29 August 2014, 8:32 PM
 

 

Spreadsheet Inventor Dan Bricklin on a Life in Computing

 Posted by Martin Veitch 

If you use spreadsheets — and today the number of users that do so must be in the hundreds of millions — then every time you open a new workbook, edit a cell or calculate a formula, you can thank Dan Bricklin’s legacy. Bricklin, an MIT graduate and Harvard MBA, developed VisiCalc with Bob Frankston back in 1979. The program not only gave rise to many of the elements of modern spreadsheet programs, selling over a million copies along the way, but, after its 1981 port, also helped the IBM Personal Computer become one of the most important new products of the 20th century.

Recently I spoke to Bricklin by phone about VisiCalc, its legacy, the rise of the PC generation and what’s happened since.

First, I asked him to sketch a picture of financial management as it was when he wrote the code for what would become VisiCalc.

“For hundreds of years, financial stuff was done on pen and paper and frequently on columns and rows with labels around them. [In the 1970s] we’d be doing it on paper or typing up things. That’s how you kept your books. When they talked about book-keeping, it was exactly that: there were pages in books. Our first name for VisiCalc was Calcu-ledger because that helped explain what we were doing: providing calculations for general ledger.”
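
As a toy illustration of that "ledger with calculations" idea, the sketch below uses invented values and a deliberately tiny formula language (not VisiCalc's actual design) to show a grid where some cells hold values and others hold formulas that recalculate when an entry changes.

```python
# A minimal, invented sketch of "calculations for a general ledger":
# a grid of cells where some cells are values and one is a formula.
cells = {
    "A1": 1200.0,            # rent
    "A2": 350.0,             # utilities
    "A3": 90.0,              # supplies
    "A4": "=A1+A2+A3",       # total, expressed as a formula
}

def value(ref: str) -> float:
    """Resolve a cell reference, evaluating a formula if needed."""
    cell = cells[ref]
    if isinstance(cell, str) and cell.startswith("="):
        # Tiny formula language: only sums of cell references
        return sum(value(part) for part in cell[1:].split("+"))
    return float(cell)

print(value("A4"))      # 1640.0

cells["A2"] = 410.0     # change one entry...
print(value("A4"))      # ...and the total recalculates: 1700.0
```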

Although the spreadsheet made his name, Bricklin had been largely concentrating on another software category that was to change the way the world worked.

“My background was in word processing but, back then, computerised printing of letters was mostly used in things like fundraising where you’d print one letter hundreds of times. The idea of using that equipment for a plain old letter by a typist… they were just too expensive. The idea of a screen-based word processor was a new thing when I was working in the Seventies at Digital Equipment Corporation (DEC) but I had been exposed to systems like [drawing program] Sketchpad which were interactive programs that had started to become popular in the Sixties and Seventies in research institutions. Computers were becoming important in newspapers so a reporter could type something in and see what it would look like but [for the majority of people] the idea of using a computer to do these things was new.”

When Bricklin prototyped VisiCalc, he showed it to his Harvard professor who told him that his competition was calculating on the back of an envelope; if VisiCalc wasn’t faster, people would never use it. That notion helped make Bricklin a pioneer in another way: delivering a user experience (even before the term had been coined) that was intuitive so a new computer user would understand the new electronic tools. So, VisiCalc looked like a ledger book. Similarly, in word processing, manual tools like scissors and glue became ‘cut’ and ‘paste’ features. Add in extra automation capabilities such as having words automatically wrap around lines of text and you had something that was revolutionary, in the days before even Wang and WordStar automated office tasks.

But at the time, computers were rare, pricey and lacking a standard.

“A person being able to buy a computer in a Radio Shack store was a new thing in the Seventies. The only connection most people had to a computer was using automated teller machines. Timesharing with a terminal where you all shared this remote computer was being developed in the Sixties and started to become popular in the Seventies. People were starting to do financial forecasting but that would cost thousands of dollars a month, plus you’d need terminals. For sizeable companies doing MRP [manufacturing resource planning] that was reasonable, but it would cost $5,000 to $10,000 each for a word processing system of letter quality.”

That pioneer environment explains why Bricklin had no great expectations for commercial success with VisiCalc but he was driven by an idea.

“I came from the word processing world and in this What-You-See-Is-What-You-Get world I’d seen a mouse and was familiar with interactive systems. I think I’d seen an Alto [early Xerox computer], played Space War [a game for the DEC PDP-1], seen Sketchpad, knew APL and Basic. The idea of having what we did with words and numbers on paper but with computation seemed pretty obvious; if there was a ‘eureka’ moment, that was it.

“But I was in word processing and did word processing take off like crazy? No. Was it on every desk? No. Today, people hardly know how to write [in longhand] but in those days the idea that computers would be cheap enough... We knew what should be but we also knew from hindsight that acceptance was very slow. I had seen the mouse in the 1970s, it was invented before that and didn’t come into acceptance until the 80s. So although we had something we saw was wonderful, we had no expectations.”

With the benefit of hindsight though, grass shoots and signs were discernible.

“There were people making money out of software. [Mainframe database maker] Cullinane was the first pre-packaged software company to go public so we knew it was possible. But on the PC there were no [commercial] models. Almost nobody knew who Bill Gates was and he was maybe making a few million dollars a year.”

Also, the economics of the day were very different as an Apple II “fully loaded” with daisywheel printer and screen cost about $5,000, the equivalent of about $18,000 today.

This was also a time of scepticism about personal computing, with the leading IT suppliers considering it a fad for hobbyists rather than a big opportunity to sell to business users. This attitude was underlined when, Bricklin says, he considered putting VisiCalc on DEC’s PDP-11 minicomputer before deciding on the Apple II.

“I was thinking about it but the sales person wasn’t very aggressive. It was classical Innovator’s Dilemma. [DEC CEO] Ken Olsen saw PCs as wheelbarrows when he was selling pickup trucks.”

That sort of attitude was unlikely to change Bricklin’s desire to set up his own company with Frankston rather than market his idea to the computing giants of the time.

“I wanted to start a business and be an entrepreneur,” he recalls. “I had taken a few classes at Harvard; there weren’t many in those days but I took those that were on offer.”

Although VisiCalc is sometimes presented as a smash hit that immediately launched the IBM PC, that notion is wrong on three points. VisiCalc was released on the Apple II in 1979, there were other ports before it was made available on the IBM PC, and the initial reaction from the wider world was lukewarm.

“When it first came out, almost nobody but a few people in the computer press wrote about it. There was a humorous article about the National Computer Conference [scene of the VisiCalc launch] in the New York Times where the VisiCalc name was considered funny and the author was making fun of all the computer terms. It then appeared in an announcement about my wedding in the Fall and my father-in-law was able to put some wording in about me being the creator of VisiCalc…

“We had ‘serious volume’ of 1,000 units per month for the first year. That’s nothing, that’s how many copies of a program are downloaded onto iPads every day, or every minute.”

But by comparison to other business software for the personal computer, VisiCalc was a success and the most clued-in sales people at resellers used it to show what personal computers could do.

“They knew that by demonstrating VisiCalc they could sell a PC… it was a killer app. People at HP got it too. One of my classmates at business school worked there and was making a small desktop computer that ran Basic and his boss put VisiCalc on that. Tandy started advertising about VisiCalc and sales started doing a lot better. By the time the IBM PC came out it was understood that it was a good thing and people in the business press started to say ‘can it run VisiCalc?’”

If the laurels are to Bricklin and Frankston for creating the modern spreadsheet from the confluence of the rise in microcomputers, business interest and new software development languages, it was another program and company that cashed in on the full flowering of those trends.

“When Lotus 1-2-3 came out [in 1983], the moon and stars were aligned for [Lotus founder and Bricklin’s friend] Mitch Kapor [to be successful] just as they had been for me to create VisiCalc,” says Bricklin, who adds that he knew the better program when he saw it.

Turn another corner and things could have been different though. Microsoft’s dominance of PC software could have been even greater had it been smarter with its Multiplan product, Bricklin believes.

Had Bricklin been more aggressive and the laws of the day been different, he could have pursued Lotus through the courts for the many features that arguably were derived from VisiCalc.

“The law in the US was that you couldn’t patent software and the chances were one in ten you could try to sneak it through and call it a system,” he recalls.

In truth, Bricklin would make an unlikely litigant and says he never considered such a path. He is proud, rather, that his legacy still looms large, even if he didn’t make the millions that others did. The tech investor Ben Rosen called VisiCalc “the software tail that wags (and sells) the personal computer dog” and there’s no doubt that it played a big part in what happened later to our digitising universe.

While some pioneers skulk and criticise others that followed them and were successful, Bricklin appears to have no trace of bitterness. He remains a staunch fan of Microsoft and Excel, a product that remains a cash machine and still bears the stamp of VisiCalc, 35 years on.

“Doing VisiCalc, I had to come up with the essence of what you need in 32K of RAM and our notion of what was important was correct, it turned out,” he says.

But the power and richness of Excel are remarkable, he says, rejecting the notion that the Redmond company is guilty of creating bloatware.

“Microsoft came from engineers building things: programmers, programmers, programmers — and the hearts and minds of programmers mattered a lot to them. People want to customise things, make it right for what their problem is. It’s the difference between being a carpenter and being an architect — one size does not fit all.

“Microsoft built systems that could be customised, so users could replace that part themselves and it listened to a lot of people and provided what they wanted, all the bells and whistles. People say you end up with bloatware and only 10% of the features get used by any user but that 10% is different for a lot of users. Apple went for smaller number of people and that’s OK because there’s Microsoft for the rest. [Microsoft] had business practices that people didn’t like but is that different than other companies in other industries? Not necessarily.

“As a child of the Sixties I think of Bob Dylan: ‘the loser now will be later to win’. It goes in cycles. The founder of Intel [Andrew Grove] said it: Only the Paranoid Survive and you only have so much time [at the top].”

If Bricklin was before his time with VisiCalc, he was also early onto new trends in user input, creating pen-operated applications at Slate Corporation in the early 1990s and in 2009, a Note Taker app for the “magical” iPad he so admires.

“I decided I wanted to get into that [iOS] world because there were times when I wanted to get something down and if I write 5 and it’s a bit off, that’s OK. But if I did that on a keyboard and it’s a 6, that’s no good for a telephone number. I got to learn what it’s like, that world of app stores and so on, and I did all the support so I got to see what people needed.”

That took him to the latest stage of his journey, as CTO for a company specialising in using HTML5 to make software multiplatform.

“I saw businesses were going to replace clipboards with tablets and that’s why I decided to join Alpha Software. I couldn’t do everything on my own because there’s so much at the back end you need to do but I wanted to innovate at the front end. Being able to customise is something that’s exciting to watch but it takes time. You’d think that in companies with billions of dollars, why would people be carrying around procedure manuals instead of on a digital reader? But they do. [Automating cross-platform capabilities] is extremely important in business. You’d think that most companies would have started taking great advantage of custom mobile opportunities internally, but most haven’t gotten there yet.”

Bricklin remains awestruck by changes he has seen in a lifetime of computing that has made him a sort of smarter Forrest Gump or Zelig for the binary age — a person who was around at some of the biggest zeitgeist moments in computing history.

“When I was working in word processing in my early twenties, I was doing programming for a person who worked for Jay Forrester and the year I was born, as it turned out, Forrester showed the Whirlwind computer on TV and that was the first time the general public got to see a computer in action, in this video from 1951 [YouTube clip].” Bricklin’s boss stayed up for days making sure it was ready for the demo, and you can see him there in the background [starting at 4.01]. “Those were computers that were the size of big rooms and I was working with him on something you could fit in a desk. And now it’s in the pocket and on a watch soon. This is a progression I’ve seen my whole life and it’s a joy each time.”

Bricklin seems content to be recognised as a founding father of the segment, rather than a Rockefeller.

“If you look at the old basketball players, they didn’t make as much either,” he says, philosophically. “But we wanted to bring computing to more people and we did that.”

Bricklin delights in the fact that science fiction has become reality and that naysayers have been disproved. It gives him pleasure to think that those who mocked the personal computer as a place to store recipes now Google their ingredients to automatically generate recipes.

“In 2001: A Space Odyssey they’re using a tablet that looks just like an iPad and it’s this magical device. The crystal ball of fiction is now real. In The Wizard of Oz they had this remote thing; the witch could see things at a distance, control things at distance. This is something you can now buy in a store: a drone-controlling iPad. You can communicate with other people in real time and you can control it with a wave of your hands.”

He ponders the rise of the PC and the changes it wrought as people were freed to create, compose, calculate and pay.

“God!” he exclaims, stretching the syllable in wonder, his voice rising to a crescendo. “It was so exciting to see the thing you believe in succeed and to be accepted. My daughter as a youngster once said, ‘Daddy, did they teach you spreadsheets at school?’ and then, after a few seconds corrected herself. ‘Wait a minute…’ That’s really cool to see people use things that we thought should be used. To be vindicated, that was pretty cool.”

Martin Veitch is Editorial Director at IDG Connect

Link: http://www.idgconnect.com/abstract/8571/spreadsheet-inventor-dan-bricklin-life-computing


Dark Social

by System Administrator - Tuesday, 10 October 2017, 6:30 PM
 

 

Dark Social | Social Oscuro

Posted by: Margaret Rouse | Contributor: Laura Aberle

Automatically translated with Google.

Dark social is a term used by marketers and search engine optimization (SEO) specialists to describe website referrals that are difficult to trace.

Dark social traffic does not appear to have a specific source, which creates a challenge for companies trying to monitor website referrals and social media activity. Most of the time, dark traffic is the result of people sharing website links through email, text messages and private chats. Because dark social links have no tracking code automatically appended to their URLs, it is not possible to know how the visitor found the content.

The term "dark social" was coined by Alexis C. Madrigal, a senior editor at The Atlantic, in a 2012 article. According to Madrigal, data from the web analytics firm Chartbeat revealed that 56.5% of The Atlantic's social traffic came from dark referrals. When Chartbeat analyzed a broader set of websites, that figure rose to almost 69%.

While Madrigal originally did not believe that mobile apps played a significant role in dark social, in a 2014 update on the subject he explained that Facebook's mobile apps, along with other mobile apps, appear to be behind most of today's dark social traffic. This finding could pose a new problem: social networking platforms such as Facebook may actually hold the most power over social traffic, but mobile apps make that traffic difficult to track and analyze.

Continue reading about dark social

Link: http://searchcontentmanagement.techtarget.com

The rise of dark social: everything you need to know

by Jack Simpson

Automatically translated with Google.

You would be forgiven for thinking that the term "dark social" refers to some kind of demonic gathering at which attendees feast on blood to please their overlords.

While potentially annoying for social media managers, dark social is somewhat less sinister than that.

It simply refers to social sharing that cannot be accurately tracked, that is, material that is not picked up by web analytics platforms.

In this post I will explain in more depth what dark social means, why it matters, and whether there is anything marketers can do about it.

What is dark social?

If someone clicks a link to your site from an open social platform such as Twitter, Facebook or LinkedIn, your analytics platform will tell you exactly where that referral came from (in theory).

However, people increasingly share links through private messaging apps such as WhatsApp or Snapchat, and continue to share through platforms such as email or SMS.

Think about it: you find an interesting article, simply copy and paste the link into a messaging app and hit send.

Millions of people do this every day, sending a great deal of traffic to publishers. But links shared this way carry no referrer tags, so when the recipient clicks one, the visit shows up as "direct" traffic.

Which is a little unfair, because it is not really direct traffic; it is unlikely that anyone typed 'https://econsultancy.com/blog/67108-is-sms-the-most-underrated-and-overlooked-dark-social-channel' into their browser.

But you cannot reasonably expect an analytics platform to know the difference.

Dark social is essentially traffic that gets lumped in with direct traffic in your analytics platform but actually comes from untrackable referrals.

These are some of the channels responsible for dark social traffic:

  • Some native mobile apps: Facebook, Instagram, etc.
  • Email: to protect users' privacy, referrers are not passed.
  • Messaging apps: WhatsApp, WeChat, Facebook Messenger, etc.
  • Secure browsing: if you click through from HTTPS to HTTP, the referrer is not passed.

Why does it matter?

According to a RadiumOne study, almost 70% of all online referrals worldwide come from dark social. For the UK, the figure rises to 75%.

Of course, that is a study from 2014, but if anything I would argue the issue has only become more prevalent since then, given the growing use of private messaging apps.

This means a large share of referral traffic is extremely hard to track accurately, and anything that puts a cloud over your data is not particularly welcome.

If you do not have the full picture, you could end up wasting time and energy optimizing the wrong things.

But you also have to consider the value of this kind of traffic.

If I find a link to a product I know my wife is looking for, and I email that link to her, it is fair to say she is likely to convert.

Dark social traffic is therefore extremely valuable. It is effectively word of mouth between people who probably know each other well (a safe assumption if they are communicating through something like private messaging apps or SMS).

What can you do about it?

You will not be able to track dark social traffic completely, but there are steps you can take to narrow things down.

If you look at the direct traffic in whatever analytics platform you are using, it is fair to say that long links like the ones showing up in our Analytics direct traffic were not typed in manually.

[Screenshot: dark social traffic reported as direct traffic]

It is therefore safe to assume, with at least some accuracy, that most of those links actually come from dark social.

You could set up a segment in your analytics that captures all direct-traffic links with parameters, which for us would be links that are not econsultancy.com, econsultancy.com/blog and so on.

This lets you get a reasonably accurate picture of how much traffic is coming from dark social.
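
As a rough sketch of that segmenting idea, the example below classifies invented visit records: "direct" visits that land on long or parameterised URLs are flagged as likely dark social. The field names, URLs and the simple-path rule are assumptions made for illustration, not features of any particular analytics platform.

```python
# Estimate likely dark social traffic from referrer-less visits (invented data).
from urllib.parse import urlparse

visits = [
    {"referrer": "",                     "url": "https://example.com/"},
    {"referrer": "",                     "url": "https://example.com/blog/67108-is-sms-dark-social?cmp=x"},
    {"referrer": "https://twitter.com/", "url": "https://example.com/blog/67108-is-sms-dark-social"},
    {"referrer": "",                     "url": "https://example.com/blog"},
]

SIMPLE_PATHS = {"", "/", "/blog"}   # pages people plausibly type in directly

def likely_dark_social(visit) -> bool:
    if visit["referrer"]:            # has a referrer -> trackable, not dark
        return False
    parsed = urlparse(visit["url"])
    return parsed.path not in SIMPLE_PATHS or bool(parsed.query)

dark = [v for v in visits if likely_dark_social(v)]
print(f"{len(dark)} of {len(visits)} visits look like dark social")   # 1 of 4
```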

It still does not tell you where and how that content was originally shared, but it will help you explain the situation when your boss is barking at you to account for where all your traffic is coming from.

You should also include highly visible sharing buttons on your site (with UTM parameters so you can track them) to encourage people to share content using those rather than copying and pasting the link.

This comes down to user experience. Make the share buttons the quickest and easiest option, and why would anyone not use them?

But be sure to include sharing buttons for email, WhatsApp and other dark social channels.

It is arguably more important to include these than buttons for Facebook and Twitter, where you can track traffic even if the link is copied and pasted.
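
For the share buttons themselves, each link can carry UTM parameters so that clicks arriving from email or messaging apps stay attributable. The helper below is a hedged sketch; the function name and parameter values are invented, not a prescribed convention.

```python
# Build a share-button URL tagged with UTM parameters (illustrative values).
from urllib.parse import urlencode

def utm_share_url(page_url: str, source: str, medium: str = "share-button",
                  campaign: str = "dark-social") -> str:
    params = urlencode({
        "utm_source": source,      # e.g. "whatsapp", "email"
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    separator = "&" if "?" in page_url else "?"
    return f"{page_url}{separator}{params}"

print(utm_share_url("https://example.com/blog/my-post", "whatsapp"))
# https://example.com/blog/my-post?utm_source=whatsapp&utm_medium=share-button&utm_campaign=dark-social
```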

Conclusion: let's be honest, nobody really knows what on earth to do about it

I am being slightly facetious with that subheading. There are some genuinely interesting conversations going on about the future of dark social.

But wherever you read or listen, the consensus seems to be the same: you can narrow things down and stay aware of how much dark social traffic you are getting, but so far I have not seen a convincing solution for tracking it accurately.

Now that dark social seems to be on everyone's radar, however, I imagine better tools and techniques will start to materialize. When they do, I will be sure to write about them.

Link: https://econsultancy.com


Data center design standards bodies

by System Administrator - Thursday, 12 March 2015, 7:49 PM
 

Words to go: Data center design standards bodies

 by Meredith Courtemanche

Need a handy reference sheet of the various data center standards organizations? Keep this list by your desk as a reference.

Several organizations produce data center design standards, best practices and guidelines. This glossary lets you keep track of which body produces which standards, and what each acronym means.

Print or bookmark this page for a quick reference of the organizations and associated websites and standards that data center designers and operators need to know.

  • ASHRAE: The American Society of Heating, Refrigerating and Air-Conditioning Engineers produces data center standards and recommendations for heating, ventilation and air conditioning installations. The technical committee develops standards for data centers' design, operations, maintenance and energy efficiency. Data center designers should consult all technical documents from ASHRAE TC 9.9: Mission Critical Facilities, Technology Spaces and Electronic Equipment. www.ashrae.org
  • BICSI: The Building Industry Consulting Service International Inc. is a global association that covers cabling design and installation. ANSI/BICSI 002-2014, Data Center Design and Implementation Best Practices, covers electrical, mechanical and telecommunications structure in a data center, with comprehensive considerations from fire protection to data center infrastructure management. www.bicsi.org
  • BREEAM: The BRE Environmental Assessment Method (BREEAM) is an environmental standard for buildings in the U.K. and nearby countries, covering design, construction and operation. The code is part of a framework for sustainable buildings that takes into account economic and social factors as well as environmental ones. It is managed by BRE Global, a building science center focused on research and certification. http://www.breeam.org/
  • The Green Grid Association: The Green Grid Association is well known for its PUE metric, defined as power usage effectiveness or efficiency. PUE measures how well data centers use power by a ratio of total building power divided by the power used by the IT equipment alone; the closer to 1 this ratio comes, the more efficiently a data center is consuming power (a worked example appears after this list). Green Grid also publishes metrics for water (WUE) and carbon (CUE) usage effectiveness based on the same concept. www.thegreengrid.org
  • IDCA: The International Data Center Authority is primarily known as a training institute, but also publishes a holistic data center design and operations ranking system: the Infinity Paradigm. Rankings cover seven layers of data centers, from location and facility through data infrastructure and applications. www.idc-a.org
  • IEEE: The Institute of Electrical and Electronics Engineers provides more than 1,300 standards and projects for various technological fields. Data center designers and operators rely on the Ethernet network cabling standard IEEE 802.3ba, as well as IEEE 802 standards, for local area networks such as IEEE 802.11 wireless LAN specifications. www.ieee.org
  • ISO: The International Organization for Standardization is an overarching international conglomeration of standards bodies. The ISO releases a wide spectrum of data center standards, several of which apply to facilities. ISO 9001 measures companies' quality control capabilities. ISO 27001 certifies an operation's security best practices, regarding physical and data security as well as business protection and continuity efforts. Other ISO standards that data center designers may require include environmental practices, such as ISO 14001 and ISO 50001. www.iso.org
  • LEED: The Leadership in Energy and Environmental Design is an international certification for environmentally conscious buildings and operations managed by the U.S. Green Building Council. Five rating systems -- building design, operations, neighborhood development and other areas -- award a LEED level -- certified, silver, gold or platinum -- based on amassed credits. The organization provides a data-center-specific project checklist, as the LEED standard includes adaptations for the unique requirements of data centers. www.usgbc.org
  • NFPA: The National Fire Protection Association publishes codes and standards to minimize and avoid damage from hazards, such as fire. No matter how virtualized or cloudified your IT infrastructure, fire regulations still govern your workloads. NFPA 75 and 76 standards dictate how data centers contain cold/cool and hot aisles with obstructions like curtains or walls. NFPA 70 requires an emergency power off button for the data center to protect emergency respondents. www.nfpa.org
  • NIST: The National Institute of Standards and Technology oversees measurements in the U.S. NIST's mission includes research on nanotechnology for electronics, building integrity and diverse other industries. For data centers, NIST offers recommendations on authorization and access. Refer to special publications 800-53, Recommended Security Controls for Federal Information Systems, and SP 800-63, Electronic Authentication Guideline. www.nist.gov
  • OCP: The Open Compute Project is known for its server and network design ideas. But OCP, started by Internet giant Facebook to promote open source in hardware, also branches into data center design. OCP's Open Rack and optical interconnect projects call for 21 inch rack slots and intra-rack photonic connections. OCP's data center design optimizes thermal efficiency with 277 Volts AC power and tailored electrical and mechanical components. www.opencompute.org
  • OIX: The Open IX Association focuses on Internet peering and interconnect performance from data centers and network operators, along with the content creators, distribution networks and consumers. It publishes technical requirements for Internet exchange points and data centers that support them. The requirements cover designed resiliency and safety of the data center, as well as connectivity and congestion management. www.open-ix.org
  • Telcordia: Telcordia is part of Ericsson, a communications technology company. The Telcordia GR-3160 Generic Requirements for Telecommunications Data Center Equipment and Spaces particularly relates to telecommunications carriers, but the best practices for network reliability and organizational simplicity can benefit any data center that delivers applications to end users or host applications for third-party operators. The standard deals with environmental protection and testing for hazards, ranging from earthquakes to lightning surges. www.ericsson.com
  • TIA: The Telecommunications Industry Association produces communications standards that target reliability and interoperability. The group's primary data center standard, ANSI/TIA-942-A, covers network architecture and access security, facility design and location, backups and redundancy, power management and more. TIA certifies data centers to ranking levels on TIA-942, based on redundancy in the cabling system. www.tiaonline.org

 

  • The Uptime Institute: The Uptime Institute certifies data center designs, builds and operations on a basis of reliable and redundant operating capability to one of four tier levels. Data center designers can certify plans; constructed facilities earn tier certification after an audit; operating facilities can prove fault tolerance and sustainable practices. Existing facilities, which cannot be designed to meet tier level certifications, can still obtain the Management Operations Stamp of Approval from Uptime. www.uptimeinstitute.com
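
As referenced in the Green Grid item above, the efficiency ratios reduce to simple divisions. The sketch below uses made-up meter readings purely to show the arithmetic.

```python
# The Green Grid efficiency ratios as plain arithmetic (invented readings).

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

def wue(annual_water_liters: float, it_energy_kwh: float) -> float:
    """Water Usage Effectiveness: site water use / IT energy (L/kWh)."""
    return annual_water_liters / it_energy_kwh

# Example: a 1,500 kW facility whose IT load draws 1,000 kW
print(round(pue(1500, 1000), 2))   # 1.5 -- the closer to 1.0, the better
```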



Data Center Efficiency

by System Administrator - Wednesday, 26 August 2015, 7:17 PM
 

eGuide: Data Center Efficiency

APC by Schneider Electric

Data center efficiency is one of the cornerstones of an effective IT infrastructure. Data centers that deliver energy efficiency, high availability, density, and scalability create the basis for well-run IT operations that fuel the business. With the right approach to data center solutions, organizations have the potential to significantly save on costs, reduce downtime, and allow for future growth.

In this eGuide, Computerworld, CIO, and Network World examine recent trends and issues related to data center efficiency. Read on to learn how a more efficient data center can make a difference in your organization.

Please read the attached eGuide.


Data Confabulation

by System Administrator - Tuesday, 12 May 2015, 12:30 AM
 

Data Confabulation

Posted by: Margaret Rouse

Data confabulation is a business intelligence term for the selective and possibly misleading use of data to support a decision that has already been made.

Within the large volumes of big data there are often small bits of evidence that contradict even clearly data-supported facts. Generally, this data noise can be recognized as such and, in the context of the whole body of data, it is clearly outweighed. When data is selectively chosen from vast sources, however, a picture can often be created to support a desired view, decision or argument that would not be supported by a more rigorously controlled method.

Data confabulation can be used both intentionally and unintentionally to promote the user’s viewpoint. When a decision is made before data is examined, there is a danger of falling prey to confirmation bias even when people are trying to be honest. The term confabulation comes from the field of psychology, where it refers to the tendency of humans to selectively remember, misinterpret or create memories to support a decision, belief or sentiment.
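
As a small illustration of how selective use of data can support a pre-made decision, the sketch below compares the mean of a full synthetic data set with the mean of a cherry-picked subset; all numbers are invented.

```python
# Cherry-picking a subset can "support" a conclusion the full data set does not.
import random

random.seed(0)
# Full data set: 1,000 customer satisfaction scores centred around 6/10
scores = [random.gauss(6.0, 1.5) for _ in range(1000)]

full_mean = sum(scores) / len(scores)

# Keep only the responses that support a decision already made ("customers love it")
cherry_picked = [s for s in scores if s >= 8.0]
picked_mean = sum(cherry_picked) / len(cherry_picked)

print(f"Full data set mean: {full_mean:.2f}")     # ~6.0
print(f"Cherry-picked mean: {picked_mean:.2f}")   # ~8.5, a misleading picture
```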

Related Terms

Definitions

  • de-anonymization (deanonymization)

    - De-anonymization is a method used to detect the original data that was subjected to processes to make it impossible -- or at least harder -- to identify the personally identifiable information (PII... (WhatIs.com)

  • data anonymization

    - The purpose of data anonymization is to make its source untraceable. Data anonymization processes include encryption, substitution, shuffling, number and data variance and nulling out data. (WhatIs.com)

  • change management

    - Change management is a systematic approach to dealing with change, both from the perspective of an organization and on the individual level. (SearchCIO.com)

Glossaries

  • Business intelligence - business analytics

    - Terms related to business intelligence, including definitions about business analytics and words and phrases about gathering, storing, analyzing and providing access to business data.

  • Internet applications

    - This WhatIs.com glossary contains terms related to Internet applications, including definitions about Software as a Service (SaaS) delivery models and words and phrases about web sites, e-commerce ...


Data Exhaust

by System Administrator - Tuesday, 12 May 2015, 12:26 AM
 

Data Exhaust

Posted by: Margaret Rouse

Data exhaust is the data generated as a byproduct of people’s online actions and choices.

Data exhaust consists of the various files generated by web browsers and their plug-ins, such as cookies, log files, temporary internet files and .sol files (flash cookies). In its less hidden and more legitimate aspect, such data is useful for tracking trends and helping websites serve their user bases more effectively. Studying data exhaust can also help improve user interface and layout design. Because these files record the specific choices an individual has made, they are very revealing and are a highly sought source of information for marketing purposes. Websites store data about people’s actions to maintain user preferences, among other purposes. Data exhaust is also used for the lucrative but privacy-compromising purposes of user tracking for research and marketing.

Data exhaust is named for the way it streams out behind the web user similarly to the way car exhaust streams out behind the motorist. An individual’s digital footprint, sometimes known as a digital dossier, is the body of data that exists as a result of actions and communications online that can in some way be traced back to them. That footprint is broken down as active and passive data traces; digital exhaust consists of the latter. In contrast with the data that people consciously create, data exhaust is unintentionally generated and people are often unaware of it.

Security and privacy software makers struggle with the conflicting goals of marketing and privacy. User software designed to protect security and privacy often disrupts online marketing and research business models. While new methods of persistently storing tracking data are always in development, software vendors constantly design new methods to remove them.

See Michelle Clark's TEDx talk about digital footprints:

Link: http://whatis.techtarget.com


Data Lake

by System Administrator - Thursday, 25 June 2015, 10:29 PM
 

 

Author: John O’Brien

It would be an understatement to say that the hype surrounding the data lake is causing confusion in the industry. Perhaps this is an inherent consequence of the data industry's need for buzzwords: it's not uncommon for a term to rise to popularity long before there is a clear definition and repeatable business value. We have seen this phenomenon many times, when concepts including "big data," "data reservoir," and even the "data warehouse" first emerged in the industry. Today's newcomer to the data world vernacular—the "data lake"—is a term that has endured both the scrutiny of pundits who harp on the risk of digging a data swamp and, likewise, the vision of those who see the concept's potential to have a profound impact on enterprise data architecture. As the data lake term begins to come off its hype cycle and face the pressures of pragmatic IT and business stakeholders, the demand for clear data lake definitions, use cases, and best practices continues to grow.

This paper aims to clarify the data lake concept by combining fundamental data and information management principles with the experiences of existing implementations to explain how current data architectures will transform into a modern data architecture. The data lake is a foundational component and common denominator of the modern data architecture, enabling and complementing specialized components such as enterprise data warehouses, discovery-oriented environments, and highly specialized analytic or operational data technologies within or external to the Hadoop ecosystem. The data lake has therefore become the metaphor for the transformation of enterprise data management, and its definition will continue to evolve according to established principles, drivers, and best practices that will quickly emerge as companies apply hindsight.

Please read the attached guide.

 


Data Profiling

by System Administrator - Tuesday, 30 December 2014, 3:24 PM
 

Data Profiling

Posted by Margaret Rouse

Data profiling, also called data archeology, is the statistical analysis and assessment of data values within a data set for consistency, uniqueness and logic.

The data profiling process cannot identify inaccurate data; it can only identify business rule violations and anomalies. The insight gained by data profiling can be used to determine how difficult it will be to use existing data for other purposes. It can also be used to provide metrics to assess data quality and help determine whether or not metadata accurately describes the source data.

Profiling tools evaluate the actual content, structure and quality of the data by exploring relationships that exist between value collections both within and across data sets. For example, by examining the frequency distribution of different values for each column in a table, an analyst can gain insight into the type and use of each column. Cross-column analysis can be used to expose embedded value dependencies, and inter-table analysis allows the analyst to discover overlapping value sets that represent foreign key relationships between entities.
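
As a small illustration of those profiling steps, the sketch below computes per-column frequency distributions and one simple cross-column rule check on an invented table; the column names and the rule are assumptions made for illustration.

```python
# Minimal column profiling on an invented data set: frequency distributions
# per column plus a simple cross-column business-rule check.
from collections import Counter

rows = [
    {"country": "UY", "status": "active",   "age": 34},
    {"country": "UY", "status": "active",   "age": 41},
    {"country": "AR", "status": "inactive", "age": 29},
    {"country": "UY", "status": "active",   "age": None},   # anomaly: missing value
]

# Frequency distribution of values in each column
for column in rows[0]:
    counts = Counter(row[column] for row in rows)
    print(column, dict(counts))

# Rule check across columns: every active row should have an age
violations = [r for r in rows if r["status"] == "active" and r["age"] is None]
print("rule violations:", violations)
```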

See also: data modeling, data dictionary, data deduplication

Link: http://searchdatamanagement.techtarget.com


Data Silo

by System Administrator - Monday, 20 July 2015, 4:59 PM
 

Data Silo

Posted by Margaret Rouse

A data silo is a repository of fixed data that an organization does not regularly use in its day-to-day operation.

So-called siloed data cannot exchange content with other systems in the organization. The expressions "data silo" and "siloed data" arise from the inherent isolation of the information. The data in a silo remains sealed off from the rest of the organization, like grain in a farm silo is closed off from the outside elements.

In recent years, data silos have faced increasing criticism as an impediment to productivity and a danger to data integrity. Data silos also increase the risk that current (or more recent) data will accidentally get overwritten with outdated (or less recent) data. When two or more silos exist for the same data, their contents might differ, creating confusion as to which repository represents the most legitimate or up-to-date version.

Cloud-based data, in contrast to siloed data, can continuously evolve to keep pace with the needs of an organization, its clients, its associates, and its customers. For frequently modified information, cloud backup offers a reasonable alternative to data silos, especially for small and moderate quantities of data. When stored information does not need to be accessed regularly or frequently, it can be kept in a single cloud archive rather than in multiple data silos, ensuring data integration (consistency) among all members and departments in the organization. For these reasons, many organizations have begun to move away from data silos and into cloud-based backup and archiving solutions.

Continue Reading About data silo

Link: http://searchcloudapplications.techtarget.com

 


Database-as-a-Service (DBaaS)

by System Administrator - Monday, 16 February 2015, 3:42 PM
 

Why Database-as-a-Service (DBaaS)?

IBM Cloudant manages, scales and supports your fast-growing data needs 24x7, so you can stay focused on new development and growing your business.

Fully managed, instantly provisioned, and highly available

In a large organization, it can take several weeks for a DBMS instance to be provisioned for a new development project, which limits innovation and agility. Cloudant DBaaS helps to enable instant provisioning of your data layer, so that you can begin new development whenever you need. Unlike Do-It-Yourself (DIY) databases, DBaaS solutions like Cloudant provide specific levels of data layer performance and up time. The managed DBaaS capability can help reduce risk of service delivery failure for you and your projects.

Build more. Grow more

With a fully managed NoSQL database service, you do not have to worry about the time, cost and complexity associated with database administration, architecture and hardware. Now you can stay focused on developing new apps and growing your business to new heights.

Who uses DBaaS?

Companies of all sizes, from startups to mega-users, use Cloudant to manage data for large or fast-growing web and mobile apps in e-commerce, online education, gaming, financial services, and other industries. Cloudant is best suited for applications that need a database to handle a massively concurrent mix of low-latency reads and writes. Its data replication and synchronization technology also enables continuous data availability, as well as offline app usage for mobile or remote users.

As a JSON document store, Cloudant is ideal for managing multistructured or unstructured data. Advanced indexing makes it easy to enrich applications with location-based (geospatial) services, full-text search, and near-real-time analytics.
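
As a rough sketch of what working with a JSON document store such as Cloudant looks like (its HTTP API is CouchDB-compatible), the Python snippet below creates a database, writes one JSON document and reads it back. The account URL, credentials and database name are placeholders rather than real values, and authentication details vary by account.

# Hedged sketch of Cloudant's CouchDB-style HTTP API using the requests library.
# The account URL, credentials and database name below are placeholders.
import requests

ACCOUNT = "https://ACCOUNT.cloudant.com"   # placeholder account URL
AUTH = ("USERNAME", "PASSWORD")            # placeholder credentials
DB = "orders"                              # placeholder database name

requests.put(f"{ACCOUNT}/{DB}", auth=AUTH)                          # create the database
doc = {"_id": "order-1001", "customer": "acme", "total": 42.5}
requests.put(f"{ACCOUNT}/{DB}/{doc['_id']}", json=doc, auth=AUTH)   # write a document
response = requests.get(f"{ACCOUNT}/{DB}/order-1001", auth=AUTH)    # read it back
print(response.json())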

Please read the attached whitepaper.

Picture of System Administrator

Decoding DNA: New Twists and Turns (DNA)

by System Administrator - Wednesday, 26 June 2013, 10:19 PM
 

The Scientist takes a bold look at what the future holds for DNA research, bringing together senior investigators and key leaders in the field of genetics and genomics in this 3-part webinar series.

The structure of DNA was solved on February 28, 1953 by James D. Watson and Francis H. Crick, who recognized at once the potential of DNA's double helical structure for storing genetic information — the blueprint of life. For 60 years, this exciting discovery has inspired scientists to decipher the molecule's manifold secrets and resulted in a steady stream of innovative advances in genetics and genomics.

Honoring our editorial mission, The Scientist will take a bold look at what the future holds for DNA research, bringing together senior investigators and key leaders in the field of genetics and genomics in this 3-part webinar series.

What's Next in Next-Generation Sequencing?


Original Broadcast Date: Tuesday March 5, 2013
VIEW THE VIDEO NOW

The advent of next-generation sequencing is widely considered the field's most transformative technological advance, doubling the volume of sequence data roughly every five months and driving a precipitous drop in the cost of sequencing DNA. The first webinar will track the evolution of next-generation sequencing and explore what the future holds in terms of the technology and its applications.

Panelists:

George Church is a professor of genetics at Harvard Medical School and Director of the Personal Genome Project, which provides the world's only open-access information on human genomic, environmental and trait data (GET). His 1984 Harvard PhD included the first methods for direct genome sequencing, molecular multiplexing, and barcoding. These led to the first commercial genome sequence (the pathogen Helicobacter pylori) in 1994. His innovations in "next generation" genome sequencing and synthesis and cell/tissue engineering resulted in 12 companies spanning fields including medical genomics (Knome, Alacris, AbVitro, GoodStart, Pathogenica) and synthetic biology (LS9, Joule, Gen9, WarpDrive), as well as new privacy, biosafety, and biosecurity policies. He is director of the NIH Centers of Excellence in Genomic Science. His honors include election to the NAS and NAE and the Franklin Bower Laureate for Achievement in Science.

George Weinstock is currently a professor of genetics and of molecular microbiology at Washington University in Saint Louis. He was previously codirector of the Human Genome Sequencing Center at Baylor College of Medicine in Houston, Texas where he was also a professor of molecular and human genetics. Dr. Weinstock received his BS degree from the University of Michigan (Biophysics, 1970) and his PhD from the Massachusetts Institute of Technology (Microbiology, 1977).

Joel Dudley is an assistant professor of genetics and genomic sciences and Director of Biomedical Informatics at Mount Sinai School of Medicine in New York City. His current research is focused on solving key problems in genomic and systems medicine through the development and application of translational and biomedical informatics methodologies. Dudley's published research covers topics in bioinformatics, genomic medicine, personal and clinical genomics, as well as drug and biomarker discovery. His recent work with coauthors, describing a novel systems-based approach to computational drug repositioning, was featured in the Wall Street Journal and earned designation as the NHGRI Director's Genome Advance of the Month. He is also coauthor (with Konrad Karczewski) of the forthcoming book, Exploring Personal Genomics. Dudley received a BS in microbiology from Arizona State University and an MS and PhD in biomedical informatics from Stanford University School of Medicine.

Unraveling the Secrets of the Epigenome

Original Broadcast Date: Thursday April 18, 2013
VIEW THE VIDEO NOW

This second webinar in The Scientist's Decoding DNA series will cover the Secrets of the Epigenome, discussing what is currently known about DNA methylation, histone modifications, and chromatin remodeling and how this knowledge can translate to useful therapies.

Panelists:

Stephen Baylin is a professor of medicine and of oncology at the Johns Hopkins University School of Medicine, where he is also Chief of the Cancer Biology Division of the Oncology Center and Associate Director for Research of The Sidney Kimmel Comprehensive Cancer Center. Together with Peter Jones of the University of Southern California, Baylin also leads the Epigenetic Therapy Stand Up To Cancer Team (SU2C). He and his colleagues have fostered the concept that DNA hypermethylation of gene promoters, with its associated transcriptional silencing, can serve as an alternative to mutations for producing loss of tumor-suppressor gene function. Baylin earned both his BS and MD degrees from Duke University, where he completed his internship and first-year residency in internal medicine. He then spent 2 years at the National Heart and Lung Institute of the National Institutes of Health. In 1971, he joined the departments of oncology and medicine at the Johns Hopkins University School of Medicine, an affiliation that still continues.

Victoria Richon heads the Drug Discovery and Preclinical Development Global Oncology Division at Sanofi. Richon joined Sanofi in November 2012 from Epizyme, where she served as vice president of biological sciences beginning in 2008. At Epizyme she was responsible for the strategy and execution of drug discovery and development efforts that ranged from target identification through candidate selection and clinical development, including biomarker strategy and execution. Richon received her BA in chemistry from the University of Vermont and her PhD in biochemistry from the University of Nebraska. She completed her postdoctoral research at Memorial Sloan-Kettering Cancer Center.

Paolo Sassone-Corsi is Donald Bren Professor of Biological Chemistry and Director of the Center for Epigenetics and Metabolism at the University of California, Irvine, School of Medicine. Sassone-Corsi is a molecular and cell biologist who has pioneered the links between cell-signaling pathways and the control of gene expression. His research on transcriptional regulation has elucidated a remarkable variety of molecular mechanisms relevant to the fields of endocrinology, neuroscience, metabolism, and cancer. He received his PhD from the University of Naples and completed his postdoctoral research at CNRS, in Strasbourg, France.

The Impact of Personalized Medicine


Original Broadcast Date: Tuesday May 7, 2013
VIEW THE VIDEO NOW

After the human genome was sequenced, Personalized Medicine became an end goal, driving both academia and the pharma/biotech industry to find and target cellular pathways and drug therapies that are unique to an individual patient. The final webinar in the series will help us better understand The Impact of Personalized Medicine, what we can expect to gain and where we stand to lose.

Panelists:

Jay M. ("Marty") Tenenbaum is founder and chairman of Cancer Commons. Tenenbaum’s background brings a unique perspective of a world-renowned Internet commerce pioneer and visionary. He was founder and CEO of Enterprise Integration Technologies, the first company to conduct a commercial Internet transaction. Tenenbaum joined Commerce One in January 1999, when it acquired Veo Systems. As chief scientist, he was instrumental in shaping the company's business and technology strategies for the Global Trading Web. Tenenbaum holds BS and MS degrees in electrical engineering from MIT, and a PhD from Stanford University.

Amy P. Abernethy, a palliative care physician and hematologist/oncologist, directs both the Center for Learning Health Care (CLHC) in the Duke Clinical Research Institute, and the Duke Cancer Care Research Program (DCCRP) in the Duke Cancer Institute. An internationally recognized expert in health-services research, cancer informatics, and delivery of patient-centered cancer care, she directs a prolific research program (CLHC/DCCRP) which conducts patient-centered clinical trials, analyses, and policy studies. Abernethy received her MD from Duke University School of Medicine.

Geoffrey S. Ginsburg is the Director of Genomic Medicine at the Duke Institute for Genome Sciences & Policy. He is also the Executive Director of the Center for Personalized Medicine at Duke Medicine and a professor of medicine and pathology at Duke University Medical Center. His work spans oncology, infectious diseases, cardiovascular disease, and metabolic disorders. His research addresses the challenges of translating genomic information into medical practice using new and innovative paradigms, and of integrating personalized medicine into health care. Ginsburg received his MD and PhD in biophysics from Boston University and completed an internal medicine residency at Beth Israel Hospital in Boston, Massachusetts.

Abhijit “Ron” Mazumder obtained his BA from Johns Hopkins University, his PhD from the University of Maryland, and his MBA from Lehigh University. He worked for Gen-Probe, Axys Pharmaceuticals, and Motorola, developing genomics technologies. Mazumder joined Johnson & Johnson in 2003, where he led feasibility research for molecular diagnostics programs and managed technology and biomarker partnerships. In 2008, he joined Merck as a senior director and Biomarker Leader. Mazumder rejoined Johnson & Johnson in 2010 and is accountable for all aspects of the development of companion diagnostics needed to support the therapeutic pipeline, including selection of platforms and partners, oversight of diagnostic development, support of regulatory submissions, and design of clinical trials for validation of predictive biomarkers.

Link: http://www.the-scientist.com//?articles.view/articleNo/33846/title/Decoding-DNA--New-Twists-and-Turns/

Picture of System Administrator

Delivering Data Warehousing as a Cloud Service

by System Administrator - Wednesday, 8 July 2015, 9:27 PM
 

Delivering Data Warehousing as a Cloud Service

The current data revolution has made it an imperative to provide more people with access to data-driven insights faster than ever before. That's not news. But in spite of that, current technology seems almost to exist to make it as hard as possible to get access to data.

That's certainly the case for conventional data warehouse solutions, which are so complex and inflexible that they require their own teams of specialists to plan, deploy, manage, and tune them. By the time the specialists have finished, it's nearly impossible for the actual users to figure out how to get access to the data they need.

Newer 'big data' solutions do not get rid of those problems. They require new skills and often new tools as well, making them dependent on hard-to-find operations and data science experts.

Please read the attached whitepaper.

Picture of System Administrator

Designing and Building an Open ITOA Architecture

by System Administrator - Tuesday, 16 June 2015, 10:51 PM
 

Designing and Building an Open ITOA Architecture

This white paper provides a roadmap for designing and building an open IT Operations Analytics (ITOA) architecture. You will learn about a new IT data taxonomy defined by the four data sources of IT visibility: wire, machine, agent, and synthetic data sets. After weighing the role of each IT data source for your organization, you can learn how to combine them in an open ITOA architecture that avoids vendor lock-in, scales out cost-effectively, and unlocks new and unanticipated IT and business insights.
Please read the attached whitepaper.
Picture of System Administrator

Designing For DevOps

by System Administrator - Monday, 7 August 2017, 2:16 PM
 

Designing For DevOps

Sponsored by Stackify

DevOps first started as a movement around 2008 and it has grown rapidly over the last several years. In our Designing for DevOps guide, we share...

Please read the attached whitepaper...

Picture of System Administrator

Desktop as a Service (DaaS)

by System Administrator - Wednesday, 11 November 2015, 6:29 PM
 

Desktop as a Service (DaaS)

Posted by Margaret Rouse

Desktop as a Service (DaaS) is a cloud service in which the back-end of a virtual desktop infrastructure (VDI) is hosted by a cloud service provider.

DaaS has a multi-tenancy architecture and the service is purchased on a subscription basis. In the DaaS delivery model, the service provider manages the back-end responsibilities of data storage, backup, security and upgrades. Typically, the customer's personal data is copied to and from the virtual desktop during logon/logoff, and access to the desktop is device, location and network independent. While the provider handles all the back-end infrastructure costs and maintenance, customers usually manage their own desktop images, applications and security, unless those desktop management services are part of the subscription.

Desktop as a Service is a good alternative for small or mid-size businesses (SMBs) that want to provide their end users with the advantages a virtual desktop infrastructure offers, but find deploying a VDI in-house to be cost-prohibitive in terms of budget and staffing.

This definition is part of our Essential Guide: What you need to know about cloud desktops and DaaS providers

Link: http://searchvirtualdesktop.techtarget.com

Picture of System Administrator

Desktop Virtualization Security

by System Administrator - Tuesday, 16 September 2014, 12:43 AM
 

Top 10 reasons to strengthen information security with desktop virtualization

Regain control and reduce risk without sacrificing business productivity and growth

  • New ways of working call for new ways of managing risk. Mobility, flexwork, bring-your-own-device (BYOD) and increased collaboration across organizations have changed the risk profile and undermined existing IT architectures. The challenge is to allow people the flexibility they need for optimal business productivity while ensuring the security and compliance required by the enterprise.
  • Both IT and the business are demanding more of their networks. But networks designed to simply forward packets don't have the capability or the intelligence to understand these high-level, application-related demands. Networks need to change, as does the way IT thinks about them and manages them. In this white paper, see how enterprises can accommodate today's needs while laying the groundwork for supporting tomorrow's more-advanced software-defined networks.

Please read the attached whitepapers.

Picture of System Administrator

Development testing for C# Applications

by System Administrator - Wednesday, 26 August 2015, 3:16 PM
 

Development testing for C# Applications

Static analysis shouldn't be about finding loads of coding style or standards issues. It should be focused on finding the most critical defects. Although traditional byte code analysis solutions such as FxCop are useful, they can miss critical, crash-causing defects - plus produce a large set of coding style issues, which can slow down the development team. Learn how the Coverity Development Testing Platform can help you:

  • Find and fix resource leaks, concurrency problems and null references within Visual Studio
  • Eliminate defects such as inconsistent indentation and copy-paste errors that can only be found by understanding the intent of the programmer through source code analysis
  • Understand the impact of change to better prioritize and focus your automated testing efforts

Please read the attached whitepaper.

 

Picture of System Administrator

DevOps

by System Administrator - Wednesday, 15 February 2017, 7:07 PM
 

DevOps

How to utilize it in your IT workspace

by TechTarget

Please read the attached whitepaper.

 

Picture of System Administrator

DevOps (PMI)

by System Administrator - Monday, 29 December 2014, 5:45 PM
 

Defining DevOps: better to explain what it is not

by Jennifer Lent

Much has been written about what DevOps is: a way for developers and operations managers to collaborate; a set of best practices for managing applications in the cloud; an Agile idea built on continuous integration that enables frequent code releases.

According to Wikipedia: "DevOps is an acronym of the English words development and operations, and refers to a software development methodology centered on communication, collaboration and integration between software developers and IT operations professionals. DevOps is a response to the interdependence of software development and IT operations. Its goal is to help an organization produce software products and services rapidly. Companies with very frequent releases may need DevOps skills. Flickr developed a DevOps system to meet a business requirement of ten deployments a day. Such systems are known as continuous deployment or continuous delivery, and are often associated with lean startup methodologies. Working groups, professional associations and blogs have used the term since 2009."

The definition of DevOps covers all of these things and more. But since the term has acquired buzzword status, it may be more interesting to ask not what DevOps is, but what it is not. In this article, SearchSoftwareQuality asked several software professionals exactly that. Here is what they said.

1. DevOps is not a job title.

Postings on job sites suggest otherwise, but DevOps is not a job title, said Agile consultant Scott Ambler. "DevOps manager? I don't know what that is." DevOps should not be a job role, he said. "DevOps is about developers understanding the reality of operations, and about the operations team understanding what development involves." DevOps, the concept, is an important aspect of software development and delivery, Ambler said. "But the DevOps job title is a symptom of organizations hiring [DevOps managers] without understanding what DevOps really is. They don't get it yet."

Ambler's position on DevOps runs counter to conventional wisdom. DevOps appeared on the list of 10 job titles you are likely to encounter, according to SearchCIO.com.

2. DevOps is not a category of software tool.

DevOps is not about tools but about culture, said Patrick Debois in a presentation titled "DevOps: nonsense, tools and other smart things" at the GOTO Conference. Debois, who coined the term "DevOps" and founded the conference known as DevOpsDays, said tools play an important role in supporting the DevOps approach to software delivery and management, but DevOps is not about the tools themselves.

Ambler said the notion that there are "tools that do DevOps" reflects the current reality: DevOps, the buzzword, is still climbing toward the peak of the hype curve. "Every tool is a DevOps tool," he added, and while software vendors keep pushing their visions of DevOps, "much of the discussion is naive."

3. DevOps is not about solving an IT problem.

Despite its many meanings, DevOps is widely understood as a way to solve an IT problem: enabling development and operations to collaborate on software delivery. But that is not its ultimate goal, said Damon Edwards, managing partner of the IT consultancy DTO Solutions in Redwood City, California. "The point of DevOps is to enable your business to react to market forces as quickly, efficiently and reliably as possible. Without the business, there is no other reason for us to be talking about DevOps problems, much less spending time solving them," Edwards wrote on his blog.

Kevin Parker, a SearchSoftwareQuality expert, said the new challenge facing DevOps managers is all the attention the topic is getting from the business. "What used to be an arcane task of elaborate coordination and project management is now part diplomacy, part gatekeeping -- and a good deal of innovation."

4. DevOps is not synonymous with continuous integration.

DevOps originated in Agile as a way to support the Agile practice of more frequent code releases. But DevOps is more than that, Ambler said. "The fact that you practice continuous integration does not mean you are doing DevOps." He sees operations managers as key stakeholders that Agile teams need to work with in order to release software.

5. DevOps is not... going away.

Despite the misconceptions surrounding it, DevOps is here to stay and remains important for successful software delivery. "Whether we call it DevOps or not, change and release management is undergoing an [exponential] expansion in importance," said Parker. There is substance behind DevOps, added Ovum analyst Michael Azoff. "Of course there is hype around DevOps. We are still in the first phase. It's where Agile was a couple of years ago."

Please read the attached whitepaper: "Top tips for DevOps testing: Achieve continuous delivery"

More news and tutorials:

Link: http://searchdatacenter.techtarget.com

 

Picture of System Administrator

DevSecOps

by System Administrator - Friday, 27 February 2015, 11:23 AM
 

 

Gartner: DevOps is good; DevSecOps is better

by Nicole Laskowski

Make way for DevSecOps. According to Gartner analyst David Cearley, CIOs need to add security professionals to their DevOps teams.

DevOps, or the blending of an enterprise's applications development and systems operations teams, has become a trendy IT topic. The new operating model is often employed in conjunction with Agile software development methods and leverages the scalability of cloud computing -- all in the interest of making companies more nimble and competitive. But, according to one expert, the approach as it is typically practiced today doesn't go far enough.

David Cearley, an analyst at Gartner Inc., believes today's CIOs need to revise DevOps to include security. He calls it DevSecOps. "It's development, it's security, it's operations operating as a dynamic force to create solutions," he said.

Investing in firewalls and perimeter defense isn't bad per se, Cearley said. But with high profile breaches at Target, Home Depot and Sony that left these organizations (among others) with black eyes, it's clear that simply guarding the borders is not enough. By adding security to a DevOps program, CIOs and their teams will be forced to think about security in a more granular way -- at the start of the software development process, rather than as an afterthought.

 

David Cearley

Adding security to DevOps, in classic IT language, turns out to be a people and process problem more than a technology problem. For many organizations, these teams work in separate closets "that don't even have a common wall between them," Cearley said. Still, getting everyone in the same room will be easier than getting everyone on the same page. Luckily, most enterprises have a person uniquely suited to break down cultural barriers and demand that security become a DevOps best practice, Cearley argued: the CIO.

"The CIO is the only one [who] is in a position to do something about this because the security team reports to him, the operations team reports to him, the applications team reports to him and the architecture team reports to him," he said. "The CIO is the leader; the CIO has to direct his team to say, 'If you don't work together, go get another job somewhere else.'"

DevSecOps manifesto

1. CIO-driven

2. Collaboration of unlike teams

3. Focus on risk, not security

Source: David Cearley, Gartner Inc.

Confronting the teams' "biases and preconceived notions" of how this work should be done will be one of the CIO's biggest challenges, Cearley said. "The CIO is asking them to rethink that." One suggestion? Rather than accepting separate reports on application development, operations and security, CIOs should reinforce the importance of collaboration by demanding a "unified approach for how we're going to be able to develop, secure, operate and manage the services we're delivering to our users," he said.

Cearley also recommended that CIOs direct the conversation away from security toward risk, which can help IT better integrate the business perspective into the process. "If you start with security, the focus becomes what tools are needed to get the ultimate security. I'm sorry, but that's the wrong focus," Cearley said. "You have to start with risk." By keeping the focus on risk, CIOs will help the business understand how IT can contribute to breaking into a new market or experimenting with a new type of analytics -- as well as how IT can minimize the potential dangers of doing so.


Link: http://searchcio.techtarget.com

 

Picture of System Administrator

Difusión (KW)

by System Administrator - Thursday, 2 May 2013, 5:13 PM
 

Before going into the details of this topic, it is important that no dissemination technique opens with specific products, acronyms or other technical aspects that do not contribute to a practical view of the subject. When presenting to potential Users, speak about the general benefits. Then, for each sector, adapt the type and depth of those benefits to that sector's language and idiosyncrasies.

Each "KW Compatible" product or service has its own form of delivery and distribution. These depend on:

1. The sector they target.

2. The level of the HKW/DKW components involved.

3. The type of product or service which, in its simplest form, can be of three kinds:

a. Reusable Content for all technology platforms (social networks, cloud computing, web 3.0, virtualization, regulations, direct targeting, logistics, distribution, inventory, quality indicators, bibliography, catalogs, business rules, authorizations, medical records, research, etc.).

b. KIP transactions (through a KIP server). "How many KIPs have you received from your distributors this month?"

c. Hardware (mobile telephony, industrial automation, handhelds, the digital home).

In general, manufacturers and distributors respond to demand from their customers. They know that taking the initiative means walking on the edge of a knife. That is why it is so important to first build acceptance in the scientific and university community, where real functional components can be produced and the big question answered convincingly: "does this really work?"

On these topics the European Community is driving important initiatives such as eEurope. In each member country, the State offers significant subsidies for this kind of initiative.

The "leadership" argument is treated with great caution by potential investors in this type of technology. The KW option acts as an incubator for sector-specific ventures.

Another important argument, already discussed, is "do not compete with software developers." It is essential to present KW integration with these companies' products as true "life insurance": with KW compatibility their software will age more slowly, and the threat from open source becomes relative. Remember that the GNU/Linux project promotes the openness and reusability of programming components, not of content created by End Users who cannot, or do not care to, program.

Each of the project's major components has its own dissemination strategy.

Picture of System Administrator

Digital Marketing Plan

by System Administrator - Thursday, 17 September 2015, 7:00 PM
 

 

by Juan Carlos Muñoz | Marketing Manager, Interactive & CRM at Volvo Car España | Professor at ICEMD

Picture of System Administrator

Distributed Computing

by System Administrator - Monday, 10 August 2015, 10:13 PM
 

Distributed Computing

Posted by: Margaret Rouse

Distributed computing is a model in which components of a software system are shared among multiple computers to improve efficiency and performance. 

According to the narrowest of definitions, distributed computing is limited to programs with components shared among computers within a limited geographic area. Broader definitions include shared tasks as well as program components. In the broadest sense of the term, distributed computing just means that something is shared among multiple systems which may also be in different locations. 

In the enterprise, distributed computing has often meant putting various steps in business processes at the most efficient places in a network of computers. For example, in the typical distribution using the 3-tier model, user interface processing is performed in the PC at the user's location, business processing is done in a remote computer, and database access and processing is conducted in another computer that provides centralized access for many business processes. Typically, this kind of distributed computing uses the client/server communications model.
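
A minimal sketch of the client/server communications model described above, using Python's built-in XML-RPC modules: one process exposes a "business processing" function and a client on another machine calls it over the network. The host name, port and the discount function are purely illustrative.

# server.py -- a toy "business processing" tier exposed over XML-RPC
from xmlrpc.server import SimpleXMLRPCServer

def apply_discount(total, percent):
    """Business rule executed on the remote server."""
    return round(total * (1 - percent / 100.0), 2)

server = SimpleXMLRPCServer(("0.0.0.0", 8000), allow_none=True)
server.register_function(apply_discount)
server.serve_forever()

# client.py -- the user-interface tier running on another machine
# from xmlrpc.client import ServerProxy
# proxy = ServerProxy("http://server-host:8000")
# print(proxy.apply_discount(100.0, 15))   # prints 85.0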

The Distributed Computing Environment (DCE) is a widely-used industry standard that supports this kind of distributed computing. On the Internet, third-party service providers now offer some generalized services that fit into this model.

Grid computing is a computing model involving a distributed architecture of large numbers of computers connected to solve a complex problem. In the grid computing model, servers or personal computers run independent tasks and are loosely linked by the Internet or low-speed networks. Individual participants may allow some of their computer's processing time to be put at the service of a large problem. The largest grid computing project is SETI@home, in which individual computer owners volunteer some of their multitasking processing cycles (while concurrently still using their computer) to the Search for Extraterrestrial Intelligence (SETI) project. This computer-intensive problem uses thousands of PCs to download and search radio telescope data.

There is a great deal of disagreement over the difference between distributed computing and grid computing. According to some, grid computing is just one type of distributed computing. The SETI project, for example, characterizes the model it’s based on as distributed computing. Similarly, cloud computing, which simply involves hosted services made available to users from a remote location, may be considered a type of distributed computing, depending on who you ask.

One of the first uses of grid computing was the breaking of a cryptographic code by a group that is now known as distributed.net. That group also describes its model as distributed computing.


Link: http://whatis.techtarget.com

Picture of System Administrator

DKW© (KW)

by System Administrator - Thursday, 2 May 2013, 5:14 PM
 

DKW© Component

2003 Definition

DKW

1. Introduction - Traditional Content

In general, we speak of "content" when we refer to the treatment of a specific topic, presented in some medium or format for dissemination.

A well-organized library (using the MicroIsis software, for example) has all of its volumes accessible through several indexes, such as ISBN, Title, Authors, Publisher, Country, Language, Year of Publication, Keywords (descriptors grouped into standards), and so on.

Even if a large part of its volumes were digitized and a contextual or keyword search engine were available, we would still be talking about "traditional content." This is because it is the End User who must create the relationships between these pieces of content, as well as the context in which they apply. For a physician, for example, this can make the difference in the quality of a diagnosis, treatment or prescription. For a lawyer it can make the difference between winning and losing a case. Every activity becomes more accurate with these tools.

Let us return for a moment to the example of library material. With the proper authorizations, printed material can be photocopied, while digital material can be copied and pasted into a document, or loaded into a multimedia database. The latter two cases are the current way of reusing content in a relatively automated fashion. The "intelligence" of the final content depends on the End User's time and skill, since it is the user who provides all the links.

The web is flooded with content in a disorderly and anarchic way. "Thematic portals" offer "human knowledge" in different formats (XML, ASCII text, ODF, PHP, HTML, RTF, PDF, DOC, PPS/PPT, XLS, etc.), sometimes classified by keywords that do not even follow standard bibliographic rules or international coding schemes.

Descriptor Catalog

Example of the CIE10 international coding standard for medicine (source: TNG Consultores)

For example, if a gynecologist wants bibliographic material on "morning-after contraceptives," he or she can use a powerful medical library such as MedLine, go to the portal of a particular pharmaceutical company, or simply use a search engine such as those provided by Google, Yahoo or Altavista. What are the possible problems when retrieving this information?

1. Hundreds or thousands of links arrive, pointing to information of good, bad or mediocre quality. This is quite frustrating for the End User, who confirms again and again that the web is powerful and enormous... but often impractical. One of the goals of this project (DKW components) is that "the web comes to the User, and not the other way around, with a single click or spoken command" (Phase 2).

2. The information retrieved may include text, images and even audio and video, but it must be retrieved and classified manually by object type.

3. If the goal is simply to write a thesis, the User will select whatever content seems interesting.

But what about the rest of the collected material?

4. Suppose the User has saved the interesting pages to disk. How will they be retrieved easily? A User of average skill (like the vast majority of self-taught Users) will have saved the pages into thematic folders. An advanced User will use a multimedia document database that, through links, generates automatic catalogs for quick retrieval by keywords (descriptors), authors, titles, text fragments, and so on.

So far we have spoken of "Traditional Content" in a sense compatible with bibliography.

2. Definition

What is a DKW?

DKW

The figure above illustrates how a DKW component (Data Knowledge Component) can be generated: we need a "container" and data.

Definition

DKW

The XML language allows "formats" and "data" to be transmitted with semantics for their interpretation and validation (CDA in medicine is one of many DTD examples), and native XML databases already exist. The technology is ready for DKWs.

A "DKW cluster" is a storage system for thematic DKW components, controlled by a KIP server. In addition to concrete data, it can transmit the HKW/SKW knowledge involved, or a reference/link to where that knowledge resides.

KW Network

In a cooking recipe for the digital home, the recipe's DKW component must specify the ingredients, the preparation steps and the parameters for the "drivers" of the household appliances that will be used. All of these data are executed through the home KIP server.

Any kind of community (open or closed) will be able to create and store its DKW components in clusters controlled by KIP servers.

3. Construction

The steps for building DKW components can be as follows (a minimal illustrative sketch follows the list):

1. Choose the "knowledge" to use. This means loading into the End User's framework the HKW/SKW component that corresponds to the data to be entered.

2. "Alignment." This means using the sector's coding standards and/or putting the data into the appropriate format. Some of these data will be OIDs (universal object identifiers).

3. Specify whether the DKW will embed the HKW/SKW "knowledge" or simply include a reference/link to it.

4. Save the DKW. It can remain in the local database or be loaded into the community KIP's cluster.

5. Optionally, run a "synapsis" to retrieve and analyze knowledge related to the newly created DKW. This may generate one or more SKW components, which may or may not turn out to be meaningful.
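
The sketch below illustrates, in Python, what packaging data into a DKW-style XML envelope could look like. The element names (dkw, knowledgeRef, payload) and the kip:// reference are invented for illustration only; they are not defined by the KW project or by any sector standard.

# Purely illustrative sketch of a DKW-style XML package: data plus a
# reference/link to the related HKW/SKW knowledge (construction step 3).
# All element names and the kip:// URI scheme are invented for this example.
import xml.etree.ElementTree as ET

def build_dkw(theme, knowledge_ref, data):
    root = ET.Element("dkw", attrib={"theme": theme})
    ET.SubElement(root, "knowledgeRef", attrib={"href": knowledge_ref})
    payload = ET.SubElement(root, "payload")
    for name, value in data.items():
        ET.SubElement(payload, "item", attrib={"name": name}).text = str(value)
    return ET.tostring(root, encoding="unicode")

packet = build_dkw(
    theme="recipe",
    knowledge_ref="kip://community/hkw/recipes#basic",
    data={"ingredient": "flour", "quantity_g": 500, "appliance": "oven"},
)
print(packet)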

Many companies in today's market are highly specialized in producing thematic, sector-specific and general multimedia catalogs for the media. These companies have what is known as a "Content Department," which generally performs two basic functions:

A. It creates the content (future DKW components) for the products and services offered, in a format the market accepts.

B. It helps "align" that content with the standards recommended by the value chain it belongs to. This is essential to achieve "connection" and the next step: interoperability.

Communities, social networks, companies and independent professionals may need these services to connect with their value chain. Today these content-construction and alignment services generate far more revenue than the network traffic they use.

DKW technology can change this scenario completely.

4. Dissemination Strategy

DKW and Content Departments: a DKW component packages specific data and, optionally, the related HKW and/or SKW logic (for example, an XML text with data and semantics for its interpretation and validation). A DKW is therefore data and information about products, services, people, processes, logistics, bibliography, news, social networks and/or a community.

  • IMPORTANT: DKW technology does not contradict or attempt to change sector-specific or international standards, or standards developed by the User (this means that the data inside the body of a DKW package can follow any set of rules).
  • The role of the Content Department is as important as, or more important than, the technology involved. It must be easy to convert the data and information that already exist about the User's products and services, which have required an enormous investment over time. The User's existing IT investment must be respected at all times.
  • HKW/SKW/DKW/KIP technology increases the productivity of what the User already has in the company or community, meaning a major improvement in the performance of the IT infrastructure in use. It brings the User up to date with what was always perceived as the noise of the "information age."
  • The different sector Content Departments will run specific campaigns for Users, helping them create HKW and DKW components. This policy should be applied to Users considered "hinges" (social networks, pharmaceutical laboratories, large retail chains, telecoms and manufacturers with large sales networks). If a large buyer adopts KW technology, its entire value chain will follow.
  • An important strategy is to create, within the local "KW Foundation," sector committees that "democratize" the contribution of each actor (an individual in a social network, manufacturer, importer, distributor, retailer, software developer, representative of the sector standard, bank delegates, consumer representatives and the State). Such a committee "validates" the technology naturally and efficiently, providing coherence between the "KW philosophy" and the actions taken to implement it (the resulting knowledge clusters carry a double certification: "KW Compatible" and that of the sector committee).

The market's current content providers will be able to distribute their products in a more granular and efficient way. The market that opens up to them is larger and more ambitious than the one they serve today. The proactive interaction of their DKW components with software applications creates a new business model and unit for them and, above all, allows them to prevent piracy with far more reliable means than today's.

Picture of System Administrator

DNA Machines (DNA)

by System Administrator - Monday, 1 July 2013, 12:53 PM
 

DNA Machines Inch Forward

Researchers are using DNA to compute, power, and sense.

By Sabrina Richards | March 5, 2013

Advances in nanotechnology are paving the way for a variety of “intelligent” nano-devices, from those that seek out and kill cancer cells to microscopic robots that build designer drugs. In the push to create such nano-sized devices, researchers have come to rely on DNA. With just a few bases, DNA may not have the complexity of amino acid-based proteins, but some scientists find this minimalism appealing.

“The rules that govern DNA’s interactions are simple and easy to control,” explained Andrew Turberfield, a nanoscientist at the University of Oxford. “A pairs with T, and C pairs with G, and that’s basically it.” The limited options make DNA-based nanomachines more straightforward to design than protein-based alternatives, he noted, yet they could serve many of the same functions. Indeed, the last decade has seen the development of a dizzying array of DNA-based nanomachines, including DNA walkers, computers, and biosensors.

Furthermore, like protein-based machines, the new technologies rely on the same building blocks that cells use. As such, DNA machines “piggyback on natural cellular processes and work happily with the cell,” said Timothy Lu, a synthetic biologist at the Massachusetts Institute of Technology (MIT), allowing nanoscientists to “think about addressing issues related to human disease.”

Walk the line

One of the major advancements of DNA nanotechnology is the development of DNA nanomotors—miniscule devices that can move on their own. Such autonomously moving devices could potentially be programmed to carry drugs directly to target tissues, or serve as tiny factories by building products like designer drugs or even other nanomachines.

DNA-based nanomachines rely on single-stranded DNA's natural tendency to bind strands with complementary sequences, setting up tracks of DNA to serve as toeholds for the single-stranded feet of DNA walkers. In 2009, Nadrian Seeman's team at New York University built a tiny DNA walker with two legs that moved like an inchworm along a 49-nanometer-long DNA path.

But to direct drugs or assemble useful products, researchers need DNA nanomachines to do more than move blindly forward. In 2010, Seeman created a DNA walker that served as a "nanoscale assembly line" to construct different products. In this system, a six-armed DNA walker shaped like a starfish somersaulted along a DNA track, passing three DNA way stations that each provided a different type of gold particle. The researchers could change the cargo stations' conformations to bring the gold particles within the robot's reach, allowing them to get picked up, or to move them farther away so that the robot would simply pass them by.

“It’s analogous to the chassis of a car going down an assembly line,” explained Seeman. The walker “could pick up nothing, any one of three different cargos, two of three different, or all three cargos,” he said—a total of 8 different products.

And last year, Oxford’s Turberfield added another capability to the DNA walker tool box: navigating divergent paths. Turberfield and his colleagues created a DNA nanomotor that could be programmed to choose one of four destinations via a branching DNA track. The track itself could be programmed to guide the nanomotor, and in the most sophisticated version of the system, Turberfield’s nanomachine carried its own path-determining instructions.

Next up, Turberfield hopes to make the process “faster and simpler” so that the nanomotor can be harnessed to build a biomolecule. “The idea we’re pursuing is as it takes a step, it couples that step to a chemical reaction,” he explained. This would enable a DNA nanomotor to string together a polymer, perhaps as a method to “build” drugs for medical purposes, he added.

DNA-based biosensing

DNA’s flexibility and simplicity has also been harnessed to create an easily regenerated biosensor. Chemist Weihong Tan at the University of Florida realized that DNA could be used to create a sensor capable of easily switching from its “on” state back to its “off” state. As proof of principle, Tan and his team designed biosensor switches by attaching dye-conjugated silver beads to DNA strands and studding the strands onto a gold surface. In the “off” state, the switches are pushed upright by extra DNA strands that fold around them, holding the silver beads away from the gold surface. These extra “off”-holding strands are designed to bind to the target molecule—in this case ATP—such that adding the target to the system coaxes the supporting strands away from the DNA switches. This allows the switch to fold over, bringing the silver bead within a few nanometers of the gold surface and creating a “hotspot” for Raman spectroscopy —the switch’s “on” state.

Previous work on creating biosensors based on Raman spectroscopy, which measures the shift in energy from a laser beam after it’s scattered by individual molecules, created irreversible hotspots. But Tan can wash away the ATP and add more supporting strands to easily ready his sensor for another round of detection, making it a re-usable technology.

Though his sensor is in its early stages, Tan envisions designing biosensors for medical applications like cancer biomarker detection. By using detection strands that bind directly to a specific cancer biomarker, biosensors based on Tan’s strategy would be able to sensitively detect signs of cancer without need for prior labeling with radionuclides or fluorescent dyes, he noted.

Computing with DNA

Yet another potential use for DNA is in data storage and computing, and researchers have recently demonstrated the molecule's ability to store and transmit information. Researchers at Harvard University recently packed an impressive density of information into DNA—more than 5 petabits (a petabit is 1,000 terabits) of data per cubic millimeter of DNA—and other scientists are hoping to take advantage of DNA's ability to encode instructions for turning genes on and off to create entire DNA-based computers.

Although it’s unlikely that DNA-based computing will ever be as lightning fast as the silicon-based chips in our laptops and smartphones, DNA “allows us to bring computation to other realms where silicon-based computing will not perform,” said MIT’s Lu—such as living cells.

In his latest project, published last month (February 10) in Nature Biotechnology, Lu and his colleagues used Escherichia coli cells to design cell-based logic circuits that “remember” what functions they’ve performed by permanently altering DNA sequences. The system relies on DNA recombinases that can flip the direction of transcriptional promoters or terminators placed in front of a green fluorescent protein (GFP) gene. Flipping a backward-facing promoter can turn on GFP expression, for example, as can inverting a forward-facing terminator. In contrast, inverting a forward-facing promoter or a backward-facing terminator can block GFP expression. By using target sequences unique to two different DNA recombinases, Lu could control which promoters or terminators were flipped. By switching the number and direction of promoters and terminators, as well as changing which recombinase target sequences flanked each genetic element, Lu and his team induced the bacterial cells to perform basic logic functions, such as AND and OR.

Importantly, because the recombinases permanently alter the bacteria’s DNA sequence, the cells “remember” the logic functions they’ve completed—even after the inputs are long gone and 90 cell divisions have passed. Lu already envisions medical applications relying on such a system. For example, he speculated that bacterial cells could be programmed to signal the existence of tiny intestinal bleeds that may indicate intestinal cancer by expressing a dye in response to bloody stool. Such a diagnostic tool could be designed in the form of a probiotic pill, he said, replacing more invasive procedures.

Applications based on these studies are still years away from the bedside or the commercial market, but researchers are optimistic. “[It’s] increasingly possible to build more sophisticated things on a nanometer scale,” said Turberfield. “We’re at very early stages, but we’re feeling our way.”

Picture of System Administrator

DNA Storage (DNA)

by System Administrator - Wednesday, 26 June 2013, 10:06 PM
 

DNA storage is the process of encoding and decoding binary data onto and from synthesized strands of DNA (deoxyribonucleic acid). In nature, DNA molecules contain genetic blueprints for living cells and organisms.

To store a binary digital file as DNA, the individual bits (binary digits) are converted from 1 and 0 to the letters A, C, G, and T. These letters represent the four main compounds in DNA: adenine, cytosine, guanine, and thymine. The physical storage medium is a synthesized DNA molecule containing these four compounds in a sequence corresponding to the order of the bits in the digital file. To recover the data, the sequence of A, C, G, and T representing the DNA molecule is decoded back into the original sequence of bits 1 and 0.
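
As a toy illustration of the bit-to-base mapping described above, the short Python sketch below converts bytes to a base sequence and back. The two-bits-per-base mapping shown is one simple choice for illustration, not the scheme actually used by the EMBL researchers, which also adds error correction and avoids long runs of the same base.

# Toy encoder/decoder: 2 bits per base (00->A, 01->C, 10->G, 11->T).
# Real DNA-storage schemes add error correction and avoid homopolymer runs;
# this sketch only demonstrates the basic mapping idea.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(sequence: str) -> bytes:
    bits = "".join(BASE_TO_BITS[base] for base in sequence)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

dna = encode(b"Hi")            # two bytes become an eight-base sequence
assert decode(dna) == b"Hi"    # the round trip recovers the original bytes
print(dna)                     # CAGACGGC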

Researchers at the European Molecular Biology Laboratory (EMBL) have encoded audio, image, and text files into a synthesized DNA molecule about the size of a dust grain, and then successfully read the information from the DNA to recover the files, claiming 99.99 percent accuracy.

An obvious advantage of DNA storage, should it ever become practical for everyday use, would be its ability to store massive quantities of data in media having small physical volume. Dr. Sriram Kosuri, a scientist at Harvard, believes that all the digital information currently existing in the world could reside in four grams of synthesized DNA.

A less obvious, but perhaps more significant, advantage of DNA storage is its longevity. Because DNA molecules can survive for thousands of years, a digital archive encoded in this form could be recovered by people for many generations to come. This longevity might resolve the troubling prospect of our digital age being lost to history because of the relative impermanence of optical, magnetic, and electronic media.

The principal disadvantages of DNA storage for practical use today are its slow encoding speed and high cost. The speed issue limits the technology's promise for archiving purposes in the near term, although eventually the speed may improve to the point where DNA storage can function effectively for general backup applications and perhaps even primary storage. As for the cost, Dr. Nick Goldman of the EMBL suggests that by the mid-2020s, expenses could come down to the point where the technology becomes commercially viable on a large scale.

This was last updated in April 2013

Contributor(s): Stan Gibilisco

Posted by: Margaret Rouse
 
 
