Microsoft researcher wins prestigious Turing Prize

Gigaom

Leslie Lamport, a principal researcher with Microsoft Research, has been awarded the A.M. Turing Award, the highest honor in the computing industry.

According to the Association for Computing Machinery:

Leslie Lamport originated many of the key concepts of distributed and concurrent computing, including causality and logical clocks, replicated state machines, and sequential consistency. Along with others, he invented the notion of Byzantine failure and algorithms for reaching agreement despite such failures.

A graduate of the legendary Bronx High School of Science, Lamport went on to earn a B.S. and Ph.D. in Mathematics at MIT and Brandeis, respectively. In 1978, he wrote a paper, "Time, Clocks, and the Ordering of Events in a Distributed System," that is viewed as the basis for much subsequent work in distributed systems.

Lamport, who works out of Microsoft Research’s Silicon Valley office, joined the company in 2001 after stints at SRI International and Digital…
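As a side note to the award coverage: the central idea of that 1978 paper, the logical clock, fits in a few lines. Below is a minimal sketch in Python of the two clock-update rules; it is my own illustration of the concept, not code from the article or from Lamport.

```python
# A minimal sketch of Lamport's logical clock rules from the 1978
# "Time, Clocks" paper (illustrative only).

class LamportClock:
    """Each process keeps a counter that orders events causally."""

    def __init__(self):
        self.time = 0

    def local_event(self):
        # Rule 1: increment the clock before each local event.
        self.time += 1
        return self.time

    def send(self):
        # Sending is a local event; the timestamp travels with the message.
        return self.local_event()

    def receive(self, message_timestamp):
        # Rule 2: on receipt, jump past the sender's timestamp so the
        # receive event is ordered after the send event.
        self.time = max(self.time, message_timestamp) + 1
        return self.time


# Example: a message from process A to process B.
a, b = LamportClock(), LamportClock()
t_send = a.send()           # A's clock ticks to 1
t_recv = b.receive(t_send)  # B's clock becomes max(0, 1) + 1 = 2
assert t_send < t_recv      # causally related events stay ordered
```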



What is Big Data, and what does it look like?

To give a working definition, "big data" is data that can't be processed by conventional database systems: it is too big, it moves too fast, or it doesn't fit the structures of existing database architectures. To get information out of it, we have to choose an alternative way to process it. Within IT, big data is a relatively recent development. Most people think of it simply as data that is big, and that isn't entirely wrong; the defining point, though, is that extracting information from it requires new processing approaches.

Today, working with data at this scale has become viable: because the data is so massive, many cost-effective approaches have emerged to tame its volume, variability, and velocity. The question remains: what is the value of big data to an organization? The answer is simple. The value falls into two categories: analytical use and enabling new products.

Over the past decade, many of the most successful web startups have been examples of big data at work. The emergence of big data in the enterprise brings with it a necessary counterpart: agility. Successfully exploiting the value in big data requires experimentation and exploration. Whether creating new products or looking for ways to gain competitive advantage, the job calls for curiosity and an entrepreneurial outlook.

As a catch-all term, "big data" can be pretty nebulous, in the same way that the term "cloud" covers diverse technologies. Input data to big data systems could come from social networks, web server logs, traffic flow sensors, satellite imagery, audio and video streams, banking transactions, MP3s of music, the contents of web pages, scans of documents, GPS trails, telemetry from automobiles, financial market data, and much more; the list goes on. Are these all the same thing? To clarify matters, the three Vs of volume, velocity, and variety are commonly used to characterize different aspects of big data. They're a helpful lens through which to view and understand the nature of the data and the software platforms available to exploit it. Most likely, you will contend with each of the Vs to one degree or another.
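To make the three Vs slightly more concrete, here is a small sketch (hypothetical Python; the thresholds and field names are invented purely for illustration) of how one might label a data source along each axis:

```python
# A toy way to characterize a data source along the three Vs.
# The thresholds and fields below are invented for illustration only.

from dataclasses import dataclass

@dataclass
class DataSource:
    name: str
    size_tb: float        # rough total size on disk
    events_per_sec: int   # arrival rate
    formats: list         # e.g. ["json", "jpeg", "csv"]

def three_vs_profile(src: DataSource) -> dict:
    return {
        "volume": "high" if src.size_tb > 10 else "low",
        "velocity": "high" if src.events_per_sec > 1_000 else "low",
        "variety": "high" if len(src.formats) > 1 else "low",
    }

sensors = DataSource("traffic sensors", size_tb=2.5,
                     events_per_sec=50_000, formats=["csv"])
print(three_vs_profile(sensors))
# {'volume': 'low', 'velocity': 'high', 'variety': 'low'}
```

The point of such a profile is simply that different sources stress different Vs, which in turn pushes you toward different processing platforms.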