CloudOn 4.0 Brings Microsoft Office To Android Phones

CloudOn, the popular free mobile productivity app that gives its users access to Microsoft Office on their smartphones and tablets, was until now only available on iOS and on Android tablets like the Nexus 7. Starting today, however, Android smartphone users, too, will be able to use the company’s service to create, review, edit and share their Office documents from devices like the Nexus 4 and Galaxy Note.
CloudOn tells us that it is currently seeing an average of 540k new downloads per month. So far, the company says, December 20 was its best day ever with over 90k downloads.
In addition, CloudOn also updated its FileSpace – a kind of news feed for all the activity that happens around a given file. This feature brings together all the annotations, edits and notes others may have added to a document and which, as CloudOn argues, usually get lost as files get shared back and forth via email.
For the Android smartphone version, CloudOn also introduced a revamped version of the Microsoft Office ribbon that makes it easier to use on a touch-enabled device by spacing icons out a bit and making them larger so that “functions like selecting font size, turning on track changes or creating a table are dead simple for users across all editing options.”
Just like the iOS and Android tablet versions, Android smartphone users will also be able to view their documents in landscape mode, add notes and comments to any file and access their files on Dropbox, SkyDrive, Google Drive and other cloud-storage services right from the app (CloudOn doesn’t want to be in the storage business, so it relies fully on third-party services for that).
“We are thrilled that CloudOn is now available across the entire Android ecosystem, and are excited to be able to reach more of our loyal customer base,” said CloudOn CEO Milind Gadekar in a canned statement today. “In addition to bringing Microsoft Office to Android phones, we are excited to give our users a better way to create content, and also access the critical contextual information around documents as we continue to grow our productivity offering.”



Original Article : http://techcrunch.com/2013/02/21/cloudon-4-0-brings-microsoft-office-to-android-phones/

Google Launches $1,299 Chromebook Pixel With 2560×1700 3:2 12.85″ Touchscreen, Core i5 CPU, 1TB Of Google Drive Storage & Optional LTE

After a few weeks of rumors, Google just announced the latest device in its Chromebook lineup: the Chromebook Pixel. Unlike previous Chromebooks, the Pixel is aimed at power users who fully live in the cloud. The device features an impressive array of hardware specs: a 12.85-inch high-density 2560×1700 screen (that’s 4.3 million pixels) with a 3:2 aspect ratio, an Intel Core i5 processor and a whopping 1 terabyte of free storage on Google Drive for three years.
Google will also soon launch a version with a built-in LTE radio and has partnered with Verizon to offer 100 MB/month for two years of mobile broadband and with GoGo to offer 12 free in-flight Wi-Fi sessions.
The Pixel’s screen, which is obviously the highlight of the device, features a pixel density of 239 pixels per inch. That’s a bit higher than the 220 pixels/inch on the MacBook Pro with Retina display, so Google proudly notes that its laptop “has the highest pixel density of any laptop display.”
The basic Wi-Fi version of the Pixel will retail for $1,299 in the U.S. and £1,049 in the U.K. The Pixel is now available on Google Play and will also be available at select Best Buy locations in the U.S. and Currys PC World in the U.K. tomorrow. The LTE version ($1,449) will ship in the U.S. in April. The other difference between the LTE and Wi-Fi models is that the LTE version will ship with a 64GB solid-state drive and the Wi-Fi version will only have 32GB.
Google did not disclose who its hardware partners are, but the company did say that the device is being assembled in Taiwan.
“I think the hardware shines,” Google VP Sundar Pichai said at a press event in San Francisco today. Google, Pichai stressed, wanted to build a device for power users who live in the cloud. “There’s a set of users who are really committed to living completely in the cloud,” he said, and Google wanted to build the perfect laptop for them.
The first thing users will definitely notice when they first open the Pixel is the screen. Not only does it have a very high resolution, but it also features a relatively unusual aspect ratio of 3:2. According to Pichai, the reason for this was that Google looked at what people would do with this device, and given that the web still focuses on content that is meant to be displayed horizontally, the design team decided to discard the idea of a screen with the more typical 16:10 aspect ratio and went with 3:2.
The screen, Google says, includes a 0.55mm layer of touch-enabled Gorilla Glass fused directly to it. Google says this screen “gives you smooth interactions while preserving picture clarity,” and after some hands-on time with the device, we can say Google isn’t exaggerating: the screen definitely measures up to Apple’s Retina displays.
Google also stressed that this is a very premium device (something that’s obviously reflected in the price). Pichai, for example, noted that the piano hinge has the feel of a “very premium car door” and the team added rounded corners to the aluminum body to make it feel better when you hold it. Google also stressed that it redesigned numerous components and often had to resort to designing its own parts to meet its specs. The team, for example, added a third microphone to the device so it not only cancels out background noise, but also the noise you make yourself when you type on the keyboard (the Pixel has a 720p webcam for Google Hangouts and other video chats, too).
Despite the premium price and components, Pichai stressed that the overall philosophy behind the Chromebook project hasn’t changed. The Pixel, however, is meant for power users. “We also wanted to design something very premium for power users – people who spend money on their laptops,” he said at today’s presentation. The idea behind Chrome, Google says, “has always been to minimize the ‘chrome’ of the browser. In much the same way, the goal of the Pixel is to make the pixels disappear, giving people the best web experience.”
Chrome itself, of course, has also been optimized for touch, which Pichai believes will soon be on every laptop. The menus are now larger and easier to click on with your fingers.
Asked about how the Pixel compares to the MacBook Air, Pichai noted that the Pixel has a higher resolution and a touch screen, something Apple doesn’t currently offer – especially on a 12-inch device.
The price, of course, definitely puts the Pixel in a premium category and it remains to be seen how the market will react to it. It is, no doubt, the best Chromebook on the market today and the hardware, including the fit and finish of the device, is very impressive. At $1,299 for the basic version, though, some potential buyers may decide to opt for a premium Apple laptop or Ultrabook instead.
Here are the full hardware specs:
INPUTS
Gorilla® Glass multi-touch screen
Backlit Chrome keyboard
Fully clickable, etched-glass touchpad
HD Webcam
 
SCREEN
12.85″ display with a 3:2 aspect ratio
2560 x 1700, at 239 PPI
400 nit screen
178° extra-wide viewing angle
 
DIMENSIONS
297.7 x 224.6 x 16.2 mm
3.35lb / 1.52kg
 
PORTS
2 x USB 2.0
Mini DisplayPort
2-in-1 card reader supporting: SD, MMC

INNARDS
Intel® Core™ i5 processor (Dual Core 1.8GHz)
Intel® HD Graphics 4000 (Integrated)
4 GB DDR3 RAM
32 GB solid state drive (64 GB for LTE model)
AUDIO
Headphone/microphone jack
Built-in microphone array
Integrated DSP for noise cancellation
Powerful speakers tuned for clarity
INDUSTRIAL DESIGN
Active cooling with no visible vents
Machined from anodized aluminum
ENERGY STAR® certified
 
BATTERY
Up to 5 hours of active use (59 Wh battery)
NETWORK
Dual-band WiFi 802.11 a/b/g/n 2×2
Bluetooth® 3.0
GOODIES
One terabyte of Google Drive cloud storage, free for 3 years
12 free sessions of GoGo® Inflight Internet
100 MB/month for 2 years of mobile broadband from Verizon Wireless (LTE model). Carrier terms and conditions apply.
 
Original Source : http://techcrunch.com/2013/02/21/google-announces-1299-chromebook-pixel-with-2560x1700-32-12-85-touchscreen-core-i5-cpu-1tb-of-google-drive-storage-optional-lte/

Eight Answers About Predictive Analytics

With my book on the topic coming out this week, here's an interview in which I answer eight questions about predictive analytics:
1. What is predictive analytics?
2. Why is predictive analytics important?
3. Isn't prediction impossible?
4. Is predictive analytics a big data thing?
5. Did Nate Silver use predictive analytics to forecast Obama's elections?
6. Does predictive analytics invade privacy?
7. What are the hottest trends in predictive analytics?
8. What is the coolest thing predictive analytics has done?


Big Data: A Natural Solution for Disaster Relief

By Mike Smitheman, VP of Marketing at GoodData

Last Friday, a 10,000-ton meteor sped into northern Russia at 40,000 miles per hour. The resulting shock wave injured an estimated 1,000 people. While scientists were focused on another, more massive asteroid, DA14, the Russian meteor seemingly slipped under the radar, catching experts off-guard.
With big data as common in science as it is everywhere else, could we have used better tools to see this coming? What’s the role of big data in natural disasters today?
The answer is a work in progress. NASA, for one, admits to currently having a big data problem. “(D)ata is continually streaming from spacecraft on Earth and in space, faster than we can store, manage, and interpret it,” writes NASA Project Manager Nick Skytland. “In our current missions, data is transferred with radio frequency, which is relatively slow. In the future, NASA will employ technology such as optical (laser) communication to increase the download and mean a 1000x increase in the volume of data. This is much more then we can handle today and this is what we are starting to prepare for now. We are planning missions today that will easily stream more then 24TB’s a day. That’s roughly 2.4 times the entire Library of Congress – EVERY DAY. For one mission.”
NASA still needs to catch up with its data load. Other government agencies are looking for ways to collaborate more effectively. For example, the Department of Defense has secret satellites located around the world for reconnaissance. Those satellites also happen to have the capability to detect large and small meteors. The DoD, however, is nervous about sharing any information that it deems classified, so efforts are still underway to find a way to incorporate that data into the bigger scientific schema.
Real-Time Disaster Maps
Terrestrial challenges, on the other hand, are currently more amenable to big data. One of big data’s true strengths lies in crisis mapping, the process of using visualizations, footage, analysis and apps to get an overview of a disaster as it evolves. Google’s Superstorm Sandy Crisis Map tracked the course of last winter’s storm, with video footage, evacuation routes and emergency aid centers. The UN commissioned the Digital Humanitarian Network to track the real-time effects of Typhoon Pablo in the Philippines. Among other efforts, social data was analyzed to provide a detailed, real-time map of displaced people, fatalities, crop damage, broken bridges and more.
Relief Efforts
If done well, relief coordination means the difference between life and death. Big data has made it possible for relief organizations to reach their goals more quickly and effectively.
To name a few examples: Data analytics company Palantir worked with relief agencies to provide rescuers with data in advance of their arrival, such as a map of fallen trees, power outages and gas shortages. NASA and the Open Science Data Group are working on Project Matsu, which uses Hadoop to analyze and timestamp satellite pictures to give relief workers real-time disaster maps. Google Person Finder re-connects disaster victims with their friends and families.
A Nascent Science
Big data’s contributions to crisis mapping and relief efforts have already made a big difference. Yet they’re just a drop in the bucket compared to the technologies that will evolve in the coming decade. We may have push alerts sent to our mobile phones, warning of natural disasters either in our location or those of our loved ones. Rather than swarming the nearest location, disaster victims may be routed to the relief centers that can best accommodate them. With the right tools and communal efforts, we may even be able to predict the trajectories of oncoming meteorites.
 
Original Article : http://smartdatacollective.com/kathryn1723/106266/big-data-natural-solution-disaster-relief

A Data Scientist Investigates the Belgian Municipal Elections

After the provincial and municipal elections of the 14th of October in Belgium, media reported several cases of candidates who had received more preference votes than would normally have been expected. The additional votes were attributed to a problem with the touch screens of the voting machines. When a voter pressed too long while selecting a party, the system would sometimes register a preference vote for the candidate whose name appeared in the same area of the screen as the party name. The figures below illustrate the issue.
  
The figure on the left is the parties screen, i.e. the first screen the voter sees. On that screen the voter selects a party (sometimes called a list). In this case, the voter has selected the PVDA+ party, as indicated by the blue rectangle. The figure on the right-hand side is the candidates screen, i.e. the subsequent screen the voter sees. This screen shows all the candidates for the party that was selected on the previous screen.
It was reported that in some cases, especially when the voter pushed too hard or too long, the second screen would register a preference vote for the candidate at the same location on the screen as where the party name was (also indicated with a blue rectangle). For simplicity's sake, we will call the position that corresponds to the exact location of the party name a hotspot.
While some individual cases of remarkable results were widely reported in the newspapers, a more thorough investigation of the problem was, as far as I know, lacking. This effectively means that the magnitude of the effect was unknown.
As a data scientist, that's the kind of question you find interesting: you wonder just how big the "Touch Screen Effect" was. That's exactly what I investigated in a paper you can download here. In this blog post I'll try to summarize what I did, focusing on the data science aspects.
The first problem is to get the data. While a lot of (news) websites offered web applications to retrieve the results of the elections in a meaningful way, none of them allowed me to download the complete set of results, i.e. the number of preference votes for each candidate, on each list (or party), in every municipality of the Flemish part of Belgium.
So I wrote a Python script to scrape the data from one of those websites. I used the BeautifulSoup and Selenium packages for that. If the hypothesis of a "Touch Screen Effect" is true we would expect a higher number of votes on hotspots than on the other positions. So the second problem becomes: "What is the expected number of votes for a given candidate on a given position, on a given list (or party) in a given municipality?"
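As a self-contained illustration of the parsing half of that scraping step, here is a minimal sketch using only the standard library's html.parser. The table structure and the candidate names are made up; the actual script used BeautifulSoup and Selenium against the live results site.

```python
from html.parser import HTMLParser

# Hypothetical results fragment: one row per candidate with
# list position, name and preference votes.
SAMPLE_HTML = """
<table class="results">
  <tr><td>1</td><td>Janssens</td><td>1204</td></tr>
  <tr><td>2</td><td>Peeters</td><td>873</td></tr>
  <tr><td>3</td><td>Maes</td><td>431</td></tr>
</table>
"""

class ResultsParser(HTMLParser):
    """Collects (position, candidate, votes) rows from a results table."""

    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_td = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_td = False
        elif tag == "tr" and self._row:
            # A closed row is complete: convert and store it.
            pos, name, votes = self._row
            self.rows.append({"position": int(pos),
                              "candidate": name,
                              "votes": int(votes)})
            self._row = []

    def handle_data(self, data):
        if self._in_td and data.strip():
            self._row.append(data.strip())

parser = ResultsParser()
parser.feed(SAMPLE_HTML)
print(parser.rows[0])  # {'position': 1, 'candidate': 'Janssens', 'votes': 1204}
```

In the real pipeline, Selenium drives the browser through the per-municipality pages and a parser like this turns each page into rows for the analysis.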
For reasons that are detailed in the paper we chose to model the natural logarithm of the share of votes of a candidate (relative to his or her party in his or her municipality) with a multilevel regression model based on a polynomial of the third degree of the position of the candidate on the list. The polynomial is needed to capture the curvilinear nature of the data: typically the candidates at the top of a list and the candidates at the bottom of the list receive more votes than those in between. The computations were done in R, using the lme4 package. Those who are interested can find the formulas in the original paper. The figure below illustrates this curvilinear nature for a random selection of 9 party-municipality combinations. The red line comes from a local regression model, the green line is the multilevel model that was used in the final analysis.
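The shape of that relationship can be sketched in a simplified, single-level form. The snippet below is a hypothetical illustration in Python with NumPy, not the paper's lme4 model (which additionally includes the party-municipality grouping structure): it fits a third-degree polynomial to the log vote shares of one made-up list.

```python
import numpy as np

# Positions 1..12 on a hypothetical candidate list.
positions = np.arange(1, 13)

# Made-up vote shares: high at the top of the list,
# flat in the middle, a small bump at the bottom.
shares = np.array([0.30, 0.12, 0.08, 0.06, 0.05, 0.04,
                   0.04, 0.04, 0.05, 0.05, 0.06, 0.11])

# Model the natural log of the share as a cubic in list position,
# mirroring the fixed-effects part of the paper's model.
log_shares = np.log(shares)
coeffs = np.polyfit(positions, log_shares, deg=3)
fitted = np.polyval(coeffs, positions)

# Residuals are what the outlier detection later works on.
residuals = log_shares - fitted
print("cubic coefficients:", np.round(coeffs, 4))
print("largest |residual|:", round(float(np.abs(residuals).max()), 3))
```

The cubic lets the fitted curve fall from the top of the list and rise again near the bottom, which a straight line cannot do.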

So far, so good. The third problem is to decide on what constitutes an exceptional share of votes. Here we use the standardized residuals, as calculated by the LMERConvenienceFunctions package in R (because in mixed models the calculation of standardized residuals is not that straightforward). Values above 2 are considered outliers.
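The rule itself is simple to sketch. The residual values below are made up, and the real calculation standardizes a mixed model's residuals via LMERConvenienceFunctions rather than using a plain mean and sample standard deviation:

```python
from statistics import mean, stdev

# Hypothetical model residuals for the candidates on one list;
# one value (1.90) is far above the rest.
residuals = [0.10, -0.30, 0.05, 1.90, -0.20, 0.15, -0.10, 0.25]

# Standardize: subtract the mean, divide by the standard deviation.
mu, sigma = mean(residuals), stdev(residuals)
standardized = [(r - mu) / sigma for r in residuals]

# Flag positions whose standardized residual exceeds 2 as outliers.
outliers = [i for i, z in enumerate(standardized) if z > 2]
print(outliers)  # [3]
```

With real data, an index flagged here corresponds to a candidate whose vote share sits well above what the position-on-the-list model predicts.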
With these three problems solved we can show the results for the main parties in the city of Antwerp (see below). The black points are the shares of votes of the candidates (expressed as percentages) as a function of the position of the candidate. The red points are those that are considered to be outliers, using the criterion discussed above. The green line is the multilevel regression line. Hotspots are indicated by two blue vertical lines. Notice that in this figure the first 4 positions were omitted (for a justification, see the original paper).
     
In the figure above we see that all parties, except Open Vld, have an outlier that happens to be on a hotspot. There are a few parties that have more than one hotspot, while only one of them is actually an outlier. And finally, there are some outliers that are not on a hotspot. The latter observation should not come as a surprise. Exceptional election results can be attributed to lots of things, such as running a good campaign, being famous, and so on. In other words, the fourth problem is that we need to take into account that outliers could also just accidentally be on a hotspot.
The approach I'm using is based on the calculation of some simple conditional probabilities. The probability for a hotspot in Antwerp to be an outlier (i.e. have substantially more votes than expected based on the position on the list) is about 55%. The probability for a normal (non-hotspot) position to have an outlier is 7%. The ratio between the two is 7.5. This means that the probability to have an exceptionally high share of votes (i.e. being an outlier) is 7.5 times higher for a hotspot than it is for a normal position.
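The calculation behind those figures is elementary conditional probability. The sketch below uses hypothetical counts (the paper's actual counts are not reproduced in this post, so the ratio here comes out near 7.9 rather than the reported 7.5):

```python
# Hypothetical counts for one city: how many hotspot positions and
# normal positions exist, and how many of each were flagged as outliers.
hotspot_outliers, hotspot_total = 11, 20     # made-up numbers
normal_outliers, normal_total = 21, 300      # made-up numbers

# Conditional probabilities of being an outlier, given position type.
p_outlier_given_hotspot = hotspot_outliers / hotspot_total   # 0.55
p_outlier_given_normal = normal_outliers / normal_total      # 0.07

# How many times more likely is an exceptional result on a hotspot?
ratio = p_outlier_given_hotspot / p_outlier_given_normal
print(f"P(outlier | hotspot) = {p_outlier_given_hotspot:.2f}")
print(f"P(outlier | normal)  = {p_outlier_given_normal:.2f}")
print(f"ratio = {ratio:.1f}")
```

The ratio is the headline number: it compares how often hotspots produce exceptional results against how often ordinary positions do.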
It must be said that not all municipalities show effects as spectacular as Antwerp's. On the other hand, we should stress that the estimates are probably conservative. One reason is that the regression model is generally good, except for the first and last positions. This effectively means that there are quite a lot of outliers on those two positions, even though they are not relevant from the perspective of the Touch Screen issue. We can calculate less conservative estimates by disregarding the first two and last two positions. Also, my approach doesn't work well for smaller municipalities because of the lower number of observations, so we limit ourselves to the 12 most important Flemish cities where electronic voting was used. In those cities the overall, conservative, estimate is almost 2, while the less conservative estimate is 4.5.
I repeat the interpretation here: in the most important Flemish cities that used electronic voting, the probability of having a (much) higher voter share than expected is almost twice as high on a hotspot as it is for a normal position. If we use the less conservative estimate, the ratio becomes 4.5.
Some parties are more affected than others; the right-wing Vlaams Belang seems hit hardest. If we again limit the analysis to the 12 most important Flemish cities that used electronic voting, we see that for this party the probability that a hotspot is an outlier is 40%, 10 times higher than it is for other positions. If we use the less conservative estimate, the ratio becomes more than 80.
The heatmap below illustrates the situation for Vlaams Belang in the main Flemish cities. The columns are the twelve major cities in Flanders that used electronic voting. The rows are the positions on the candidate list. The blocks with a blue border are the hotspots. Notice that the total number of candidates on the list can vary across cities. The colouring of the heatmap is a function of the squared residuals of the multilevel model. We thus take a less binary approach than we did above, when we classified all cases as either being outliers or not. The dark areas (red and orange) are the positions where the voter share was close to the expectation. Lighter colours (yellow and white) indicate positions that deviated from the expectation (higher or lower).
The first thing we notice in the heatmap is that the top position and the last position are generally not well estimated by the regression model. That's not ideal, but it is less relevant from a "Touch Screen Effect" point of view (none of the top or bottom positions are on a hotspot). The important thing is that the remaining area between the top and the last position is generally dark (red or orange). There are some brighter spots, though. The brightest spots happen to be in the blue rectangles, i.e. hotspots, illustrating the effect of the position of the party name (on the previous screen) on the preference votes themselves. It appears that, apart from Turnhout, the only cities that were not affected are those where the party name was split over two candidate columns (and hence had two hotspots: Genk, Aalst and Roeselare).

Remark: for a figure of better quality see the original paper

Conclusions:
Based on these findings I'm confident that the "Touch Screen Effect" reported by the newspapers right after the municipal elections in Flanders, Belgium in 2012 was not merely anecdotal, but had a clear effect in Flanders's largest cities. The study remains inconclusive with regard to smaller municipalities. On aggregate there remains a noticeable effect across all municipalities where electronic voting was used. Some cities, such as Antwerp, and some parties, such as Vlaams Belang, seem to be more strongly affected than others.
From a data science perspective it is interesting to note that in this research I had to combine three skills:
  • Hacking skills to assemble all required data (scraping of the election results from a news website with Python).
  • Statistical skills to model the data and detect outliers (multilevel regression models in R with the lme4 and LMERConvenienceFunctions packages).
  • Substantive expertise in the political situation in Belgium to understand how the election process works.
It is not by accident that these are also the three elements in Drew Conway's Data Science Venn Diagram.


Original article

Ubuntu Touch Developer Preview Build For Nexus Devices Now Available

As promised, Canonical has released developer preview builds of Ubuntu Touch for the Galaxy Nexus, Nexus 4, Nexus 7 and Nexus 10. The builds are exactly that, developer previews, and are nowhere near ready to be used as a daily driver.
Before you jump the gun and flash the Ubuntu Developer Preview build on your Nexus device, make sure that your device is supported. At the moment, the CDMA variants of the Galaxy Nexus are not supported, nor is the 3G variant of the Nexus 7. If you are running Ubuntu, you can follow the installation instructions written here. If you are on Windows or Mac and/or are looking for the Fastboot flashable images, head over to this link and download all the image files for your Nexus device.
Since this is a developer preview, not all features and functionality work at the moment. Here is what works, according to Canonical:
  1. Shell and core applications
  2. Connection to the GSM network (on Galaxy Nexus and Nexus 4)
  3. Phone calls and SMS (on Galaxy Nexus and Nexus 4)
  4. Networking via Wifi
  5. Functional camera (front and back)
  6. Device accessible through the Android Developer Bridge tool (adb)
You can also check out the list of device specific issues here.

Preparing for Analytics 3.0

Analytics are not a new idea. The tools have been used in business since the mid-1950s. To be sure, there has been an explosion of interest in the topic, but for the first half-century of activity, the way analytics were pursued in most organizations didn’t change that much. Let’s call the initial era Analytics 1.0. This period, which stretched 55 years from 1954 (when UPS initiated the first corporate analytics group) to about 2009, was characterized by the following attributes:
  • Data sources were relatively small and structured, and came from internal sources;
  • Data had to be stored in enterprise warehouses or marts before analysis;
  • The great majority of analytical activity was descriptive analytics, or reporting;
  • Creating analytical models was a “batch” process often requiring several months;
  • Quantitative analysts were segregated from business people and decisions in “back rooms”;
  • Very few organizations “competed on analytics”—for most, analytics were marginal to their strategy.
It was in 2010 that the world began to take notice of “big data,” and we’ll have to call that the beginning of Analytics 2.0. Big data analytics were quite different from the 1.0 era in many ways. Data was often externally-sourced, and as the big data term suggests, was either very large or unstructured. The fast flow of data meant that it had to be stored and processed rapidly, often with parallel servers running Hadoop. The overall speed of analysis was much faster. Visual analytics—a form of descriptive analytics—still crowded out predictive and prescriptive techniques. The new generation of quantitative analysts was called “data scientists,” and many were not content with working in the back room. Big data and analytics not only informed internal decisions, but also formed the basis for customer-facing products and processes.
Big data, of course, is still a popular concept, and one might think that we’re still in the 2.0 period. However, there is considerable evidence that organizations are entering the Analytics 3.0 world. It’s an environment that combines the best of 1.0 and 2.0—a blend of big data and traditional analytics that yields insights and offerings with speed and impact. Although it’s early days for this new model, the traits of Analytics 3.0 are already apparent:
  • Organizations are combining large and small volumes of data, internal and external sources, and structured and unstructured formats to yield new insights in predictive and prescriptive models;
  • Analytics are supporting both internal decisions and data-based products and services for customers;
  • The Hadoopalooza continues, but often as a way to provide fast and cheap warehousing or persistence and structuring of data before analysis—we’re entering a post-warehousing world;
  • Faster technologies such as in-database and in-memory analytics are being coupled with “agile” analytical methods and machine learning techniques that produce insights at a much faster rate;
  • Many analytical models are being embedded into operational and decision processes, dramatically increasing their speed and impact;
  • Data scientists, who excel at extracting and structuring data, are working with conventional quantitative analysts who excel at modeling it—the combined teams are doing whatever is necessary to get the analytical job done;
  • Companies are beginning to create “Chief Analytics Officer” roles or equivalent titles to oversee the building of analytical capabilities;
  • Tools that support particular decisions are being pushed to the point of decision-making in highly targeted and mobile “analytical apps;”
  • Analytics are now central to many organizations’ strategies; a survey I recently worked on with Deloitte found that 44% of executives feel that analytics are strongly supporting or driving their companies’ strategies.
Even though it hasn’t been long since the advent of Big Data, I believe these attributes add up to a new era. It is clear from my research that organizations—at least the big companies—are not keeping traditional analytics and big data separate, but are combining them to form a new synthesis. Some aspects of Analytics 3.0 will no doubt continue to emerge, but organizations need to begin transitioning now to the new model. There is little doubt that analytics can transform organizations, and the firms that lead the 3.0 charge (like Procter & Gamble, which I wrote about last week) will seize the most value.

Original Article :http://blogs.wsj.com/cio/2013/02/20/preparing-for-analytics-3-0/

Infosys launches tool for data analysis

BANGALORE: Infosys has launched BigDataEdge to radically simplify the complex task of analyzing Big Data to discover relevant information. By empowering business users to rapidly develop insights from vast amounts of structured and unstructured data, it enables better business decisions in near real-time.

With Infosys BigDataEdge, enterprises can reduce the time taken to extract information by up to 40% and generate insights up to eight times faster. The product includes a rich visual interface with more than 50 customizable dashboards and 250 built-in algorithms. These algorithms, a set of reusable business rules that are both function- and industry-specific, enable business teams to self-serve the process of building insights while minimizing the need for technical intervention.

It also comes with over 50 data-source connectors, which allow easy access to structured and unstructured data residing across enterprise and external sources, accelerating the discovery of relevant information in existing, underutilized data. It offers a collaboration wall and pre-built workflows that allow teams across functions to interact on insights and collectively implement decisions. A Logical Data Warehouse provides a virtual data-management architecture, eliminating the need for data to be physically available in order to build and test insights. The product also functions as an 'out-of-the-box' application for specific industry needs, such as fraud detection and prevention, predictive analytics and monitoring, and customer micro-segmentation, that deliver faster returns on investment.

Vishnu Bhat, vice president and global head of cloud at Infosys, said: “Enterprises today cannot afford to spend an inordinate amount of time making sense of the data deluge that surrounds them. Infosys BigDataEdge draws upon our deep research and development capabilities and proven expertise in Big Data and analytics to help clients turn data into revenues faster. This unique platform is already enabling ten global organizations to develop actionable insights in a matter of days and act on them from day one.”


Original Article : http://timesofindia.indiatimes.com/tech/tech-news/software-services/Infosys-luanches-tool-for-data-analysis/articleshow/18598283.cms

Infosys’ new platform pulls big data 40% faster

Infosys on Wednesday formally launched what it says is one of the most comprehensive solutions in the big data space.

The solution, called BigDataEdge, allows enterprises to easily bring together not just the organized or structured data, but also a vast variety of unstructured data (information contained in, say, emails, document files, contracts with customers or vendors, blogs, social media, call centre voice records, videos). It then enables them to glean insights from all of this data, and take appropriate action.

One major element of the solution is a patent-pending connector framework, which automatically connects to internal and external data sources, including any new source that emerges, and which then makes pulling data together very easy.

"We have been able to reduce the time to discover and aggregate data by up to 40%," says Vishnu Bhat, head of Infosys' cloud and big data business. He says that in the case of a financial service provider, the solution was able to uncover hidden exposures in 43% of their accounts by going through all the written contracts. "Earlier, this would at best be a manual process that took many months. Now you can do it in days or weeks," he says.

The solution can convert voice calls into text to find necessary information. It uses facial recognition and similar technologies to extract information from videos.

Enterprises can then use built-in algorithms (there are some 250 of them) to obtain the insight required from a desired set of data sources, and visualize the insight using some 50 customizable dashboards. "We are able to generate insights eight times faster than is normal for enterprises," Bhat says. The solution also includes a collaboration tool that allows users across functions and regions to interact on the insights and take decisions in real time.

Bhat says the solution can even be used for specific requirements such as fraud detection or predicting network failures with its ability to match current records with historical records. For instance, people have a certain pattern of usage of their bank account. If there is a change in that pattern (because of an online fraudster), the solution quickly recognizes that and can send an alert or temporarily lock the account.
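The account-pattern matching Bhat describes can be sketched concretely. The following is an illustrative toy, not BigDataEdge's actual method: it flags a transaction whose amount deviates sharply (here, more than three standard deviations, an assumed threshold) from the account's historical pattern:

```python
from statistics import mean, stdev

def flag_anomaly(history, new_amount, z_threshold=3.0):
    """Flag a transaction whose amount deviates sharply
    from the account's historical pattern."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return new_amount != mu
    z = abs(new_amount - mu) / sigma
    return z > z_threshold

# A typical account: small, regular debits.
history = [42, 55, 38, 61, 47, 52, 44, 58]
print(flag_anomaly(history, 50))    # in-pattern: no alert
print(flag_anomaly(history, 900))   # out-of-pattern: alert or lock account
```

A production system would of course model far richer features (merchant, location, time of day) against historical records, but the core idea, comparing current behavior to a learned baseline, is the same.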

BigDataEdge is the latest in Infosys' Edge series of platforms that also includes, among others, WalletEdge, the mobile payments platform, and BrandEdge, which addresses marketing needs. 


Original Article : http://timesofindia.indiatimes.com/tech/enterprise-it/infrastructure/Infosys-new-platform-pulls-big-data-40-faster/articleshow/18603556.cms

Red Hat's Big Data strategy: A full stack approach

Summary: Open source vendor Red Hat announces a Big Data strategy that spans the full enterprise software stack, both in the public cloud and on-premise.

This morning, open source software and infrastructure provider Red Hat announced its Big Data strategy.  ZDNet's Steven J. Vaughan-Nichols covered the news earlier today:

The very occurrence of Red Hat's announcement, as well as its multiple facets, marks a new phase for Big Data: one where it has become a matter of mainstream IT infrastructure and app dev concern.
It ain't just Linux
Red Hat Enterprise Linux (RHEL) is arguably Raleigh, North Carolina-based Red Hat's flagship product, but the operating system arena is not by any means its only focus.  Red Hat also has big irons in the storage, cloud and developer fires, and its Big Data strategy announcement addressed all three of these.
Big Data is now a relevant factor in the entire enterprise software stack.
Hybrid cloud
One could argue that the crux of Red Hat's Big Data manifesto focuses on the hybrid cloud.  Red Hat's Big Data narrative entails customers working on Big Data pilots/proofs-of-concept in the public cloud today, with the need to put those projects into production in the on-premises, private cloud in the near future.
I'm not sure this narrative is quite as universal as Red Hat would have us believe, but the motivation Red Hat derives from it is nonetheless laudable: to make certain that Big Data projects can move seamlessly from the public cloud environment to the private cloud, or vice versa, without "re-tooling."
What defines the strategy?
In order for that roundtrip to be possible, and in an environment built on Red Hat Enterprise Linux, Red Hat Storage, JBoss Middleware and the OpenShift cloud platform (as well as the OpenStack cloud platform overall), Red Hat announced the following initiatives:
  • It will move governance of its Red Hat Storage Hadoop adapter, which makes Red Hat Storage compatible with the Hadoop Distributed File System (HDFS), to the Apache Software Foundation. This could pave the way for the adapter to be integrated into major Hadoop distributions. And given that Red Hat Storage uses a commodity-hardware-based distributed file system that maintains Hadoop's hallmark data locality, such an outcome wouldn't be unreasonable.
  • In order for Red Hat Storage to work effectively in the public cloud, Red Hat will pursue engineering to make Red Hat Storage accommodate multiple tenants.
  • Red Hat will fully support the JBoss Middleware Apache Hive Connector, allowing developers on its Java stack to work in a familiar, SQL query-oriented coding environment when working against Hadoop.
  • Red Hat will enhance JBoss to interoperate with MongoDB and other (unnamed) NoSQL databases.  It will also support the Open Data Protocol (OData), an open framework (originally developed at Microsoft, but now progressing toward status as an OASIS standard) for exposing data sources as RESTful Web Services in JSON and AtomPub formats. 
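To make the OData piece concrete, here is a minimal sketch of how a consumer composes a query URL using the protocol's standard `$filter` and `$top` system query options. The service root and entity set names are hypothetical, and the filter expression is left unescaped for readability:

```python
def odata_url(service_root, entity_set, filter_expr=None, top=None):
    """Compose an OData query URL using the standard
    $filter and $top system query options."""
    options = []
    if filter_expr:
        options.append("$filter=" + filter_expr)
    if top is not None:
        options.append("$top=" + str(top))
    url = service_root.rstrip("/") + "/" + entity_set
    if options:
        url += "?" + "&".join(options)
    return url

# Hypothetical service exposing customer records as JSON:
print(odata_url("https://example.com/odata", "Customers",
                filter_expr="Country eq 'US'", top=10))
# https://example.com/odata/Customers?$filter=Country eq 'US'&$top=10
```

The response to such a request would be an ordinary RESTful payload in JSON or AtomPub, which is what makes OData attractive as a neutral bridge between JBoss applications and varied data sources.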
Who the strategy involves
Red Hat also announced it will be forging hardware and software partnerships with an eye toward developing a full ecosystem around its Big Data approach.  One deliverable from these partnerships will be reference architectures that the company said could be used as "cookbooks" by enterprises to build out Big Data infrastructure with greater assurance of success.

What it means
Red Hat rightly pointed out that the majority of Big Data projects are built on open source software (including Linux, Hadoop, and various NoSQL databases), so it's fitting that a company as important to the open source world as Red Hat would announce its Big Data strategy.
What's especially significant here is that Red Hat is also an Enterprise software company, and it articulated a strategy aimed at making Big Data part of the mainstream Enterprise stable of tools and technologies.  It's a big step in the maturation process for Big Data technology. That maturation will seemingly figure heavily in the tech world in 2013.

Original Article : http://www.zdnet.com/red-hats-big-data-strategy-a-full-stack-approach-7000011574/

I.B.M. to Take Big Step Into Mobile

For I.B.M., mobile computing has come of age. At least, smartphones and tablets may be popular enough to make I.B.M. several billion dollars.
The company is announcing a major mobile initiative involving software, services and partnerships with other large vendors. I.B.M. plans to deploy consultants to give companies mobile shopping strategies, write mobile apps, crunch mobile data and manage a company’s own mobile assets securely.
Thousands of employees have been trained in mobile technologies, I.B.M. says, and corporate millions will be spent on research and acquisitions in coming years. I.B.M. also announced a deal with AT&T to offer software developers access to mobile applications from AT&T’s cloud.
“Mobile is the next big growth play that I.B.M. is going after,” said Michael J. Riegel, the head of mobile strategy. He said his company had made 10 mobile-related acquisitions already, and would have a global research and development team of 160 people dedicated to mobile technology. In 2012 alone, he said, I.B.M. won 125 patents related to mobile.
Despite its roots in computer hardware, I.B.M. long ago moved away from the business of selling things like personal computers. Much of its business now comes from higher-value work like software creation. Even its big mainframe computers, like the Jeopardy-winning Watson, are usually sold in conjunction with services and software deals.
The push into mobility comes after forays into Web commerce, data analytics and security. In each case, I.B.M. has taken an approach of signing big contracts for large-scale engagements.
By contrast, newer competitors like Google Analytics and Amazon Web Services aim for smaller sales of technologies like analytics or cloud computing, but on a mass level. I.B.M.’s entry into mobile will test whether companies want a large, pervasive approach for this kind of technology.
I.B.M.’s announcement also signals a realization among many companies that employees and customers are accessing corporate data and services via mobile devices from lots of places, any time of day. This, along with mobile access to cloud computing, is challenging many social and business assumptions.
“Our customers are leaving billions of dollars on the table,” Mr. Riegel said. “They are not getting the productivity gains they could. They need to rethink their customer relations to allow people to access them at any time.”
The move into mobile is one of the first major initiatives by Virginia M. Rometty since she became chief executive in January 2012.
Ms. Rometty previously ran I.B.M.’s Global Services division, and worked on a push to bring I.B.M. into the developing world. Mobile technologies, which are usually cheaper than conventional computers, are expected to be the way billions of people in poorer nations will come online in the next few years.

Original Article :  http://bits.blogs.nytimes.com/2013/02/20/ibm-goes-mobile/

Google launches competition to pick 'Glass Explorers' test group


Sergey Brin wearing Google Glass
Google founder Sergey Brin wearing a Google Glass prototype. Photograph: Carlo Allegri/Reuters
Google is preparing to release 8,000 test pairs of its long-awaited glasses to carefully selected winners of an online competition who will have to pay $1,500 for the privilege.
In an announcement on Wednesday, Google said it was looking for "bold, creative individuals" to help test Glass, the official name for its wearable technology, which allows users to take pictures and navigate the web using a built-in camera and a see-through computer screen.
The company is soliciting applications exclusively through Google+ and Twitter, and thousands of posts were swiftly made on both as users vied for their chance to become "Glass Explorers".
"We'd love to make everyone an Explorer, but we're starting off a bit smaller," Google said in a statement. "We're still in the early stages, and while we can't promise everything will be perfect, we can promise it will be exciting."
Google could also have promised that the experience will not be cheap: the 8,000 selected to be explorers will have to pay $1,500 plus tax for the technology and personally attend a "special pick-up experience" in New York, San Francisco or Los Angeles to collect their prize.
Perhaps concerned that early media reviews of its technology might be harsh, particularly if everything isn't "perfect", Google appears to have decided that early buzz about the glasses is likely to be more positive in the hands of diehard fans who have paid handsomely for the privilege of being involved in the testing process.
Google has been working on its Glass product for years and offered the latest glimpse of how it would work in a video released alongside the competition. Called "How it feels [through Glass]" the footage showed a man skydiving while taking a picture, a different man ruminating in front of a big block of ice before Googling a picture of a tiger and an apparently stressed person looking up flight details as they run through an airport.
Only those living in the US are eligible for the chance to "purchase Glass". To get that opportunity those interested must write a 50-words-or-less post on Google+ or Twitter, pondering what they would do with the technology and using the tag #ifihadglass. There is the option to submit a short video accompanying the application as well as up to five photos. The winner will be chosen by an independent panel who will rate each submission on "creativity", "compelling use", "originality" and "social and spectrum", according to the terms and conditions.
Google has yet to announce a general release date for Glass, and was similarly coy with details as to when it will dole the device out to the 8,000 winners, merely stating that it will contact them in mid-to-late March; the deadline for applications is 27 February.
The glasses feature a screen in the top-right corner that can display mapping information, pictures, phone contacts and more. Various videos released by Google have suggested that wearers will be able to voice-instruct the glasses to take pictures or seek information on their surroundings. The Glass website suggests the specs will come in five different colours including "tangerine", "shale" and "cotton".
Google is not the only tech company investing in wearable technology. Apple is reportedly developing the iWatch, a wrist-wearable computing device, and was granted a patent for wearable computers back in 2009. Oakley's Airwave goggles, released last October, use an inbuilt miniscreen coupled with GPS and Bluetooth to display the wearer's speed and altitude and to locate fellow skiers.

Original Article : http://www.guardian.co.uk/technology/2013/feb/20/google-glass-8000-prototypes-online-competition

When Do You Take Off Your Google Glasses?

Google Glasses will soon be available to the public, according to the company. Are we ready?
The wearable computer will let you search as you walk, navigate, record what you see and share it, translate and more. We haven't yet figured out which pictures to share on Facebook or how to make sure we don't tweet while drunk. Now we are going to have to figure out a whole new set of social rules.
Do you take off your Google Glasses in the bathroom, lest folks think you are recording? Is it acceptable to Google someone while you are speaking to them? Do you ask, "Mind if I post the conversation we just had online? I think our friends would love to comment on it".
On one hand, new technologies have always taken some adjustment. Jeff Jarvis writes that "The first serious discussion of a legal right to privacy in the United States did not come until 1890. The reason: the invention of the Kodak camera, which led to a similar moral panic about privacy, with The New York Times decrying “fiendish kodakers,” President Teddy Roosevelt outlawing kodaking in Washington parks, and legislators ready to require opt-in permission from anyone photographed in public. We negotiated our norms and cameras don’t scare us anymore. But now a new technology does."
This adjustment, however, takes time. We haven't yet developed norms to catch up with the many ways that technology and data both enhance and disrupt our lives. I am a responsible father if I check my nanny's driving record before letting my kids get in a car with her. But if I run background checks on the parents in our car pool, well that just seems odd - or does it? What are the rules for who I can text? If I have your cell number and would be comfortable calling, should I be comfortable texting? That circle seems a bit smaller, but what are the new social rules?
Behavior that violates social technology rules seems to be "creepy", for want of a more nuanced term. It's easy to blame Google or Facebook or the companies bringing us these technologies for the challenges society needs to now grapple with. But these advances are inevitable as science progresses. The access to knowledge these companies bring to the world is empowering individuals to challenge governments and bringing education to the impoverished. I guess we can put up with a bit of tension until we frame a new set of norms.
In the meantime, just do your best to not be creepy.

Original Article : http://www.linkedin.com/today/post/article/20130221045735-258347-when-do-you-take-off-your-google-glasses

This Company Will Only Accept 'Twitter Resumes' For A Six-Figure Job

By now, many employers think that who you are online is more revealing of your character than a résumé.
Some companies have decided to stop accepting paper résumés altogether.
"The paper résumé is dead," Vala Afshar, chief marketing officer at Enterasys, told Bruce Horovitz at USA Today. "The Web is your résumé. Social networks are your mass references."
For the next month, Enterasys — a wireless network provider — will be considering applicants for a six-figure senior social media position, but no paper résumés will be accepted. Instead, the company has decided to recruit solely via Twitter.
Jennifer Grabowski, a spokesperson for the company, tells us that to be considered, candidates need a Klout score above 60, a Kred influence score of at least 725, a Kred outreach score of at least eight, and more than 1,000 active Twitter followers. Enterasys is hoping to fill the position by April.
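Taken literally, the published bar is just a set of threshold checks. A minimal sketch (the exact inclusive/exclusive handling of each threshold is an assumption on my part):

```python
def meets_enterasys_bar(klout, kred_influence, kred_outreach, followers):
    """Check a candidate against the published thresholds:
    a Klout score above 60, Kred influence of at least 725,
    Kred outreach of at least 8, and more than 1,000
    active Twitter followers."""
    return (klout > 60
            and kred_influence >= 725
            and kred_outreach >= 8
            and followers > 1000)

print(meets_enterasys_bar(72, 740, 9, 4200))   # passes every threshold
print(meets_enterasys_bar(58, 740, 9, 4200))   # fails: Klout too low
```

Of course, the hard part of the screening is qualitative, judging what a candidate actually tweets, which no threshold filter captures.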
Although solely focusing on a candidate's tweets is still an uncommon recruiting tactic, more companies are heading in that direction. Rachel Emma Silverman at The Wall Street Journal reported that when Union Square Ventures needed an investment analyst last year, the VC firm asked applicants to send in links that represented their "web presence," such as social media or blog accounts.
It's true that Union Square Ventures invests heavily in Internet companies, but everyone should be paying close attention to their online presence. That means keeping your LinkedIn profile updated, setting up a Google alert for your name, and using Twitter strategically.
Rosa E. Vargas at Careerealism wrote that on Twitter it's important to use "jargon / keywords specific to your target industry" for a competitive edge, and include hashtags with keywords in your tweets so that you're more likely to pop up in searches.

Microsoft's Hotmail To Be Killed By Early Summer, Replaced By Outlook

Still got a Hotmail address? You'll be pleasantly surprised in a few months

Microsoft is finally taking Hotmail off the life-support machine it's been wheezing away on for so long. The reason? The software giant's Outlook has been such a howling success in its first six months--60 million users, according to my former colleague, NBC's Wilson Rothman--that the decision was not a hard one to make. Hotmail, the original email system of the masses until it was usurped by an email service that actually worked (take a bow, Gmail), has been limping along for years. But not any more. Microsoft is to start auto-updating its users' Hotmail accounts--all 350 million of them--to the Outlook system, and the firm expects the migration to be completed by early summer.

Outlook's myriad little touches, such as its sweep-up facility, which washes spam clean away, and its highly sociable character, have made it a popular alternative on the free mail front. Microsoft is hoping it will blow a hole in Gmail's current market dominance, and so is upping Outlook's profile with a social-centric ad campaign.


If you can't wait to update your Hotmail account, it's easy to do, says Rothman: just click on the Settings button and then Convert to Outlook. Any Gmail users tempted to switch to Outlook now? Or do Google's tools outweigh Microsoft's social ways?

Yahoo goes social, teams with Facebook for site revamp


* Mayer's biggest overhaul to Yahoo's Internet shop window
* Will import data on Facebook users, such as shared content
* Changes to be rolled out over coming days
* Can this makeover win back Yahoo's Web audience?
By Nicola Leske
Feb 20 (Reuters) - Yahoo Inc is overhauling its website to incorporate features familiar to Facebook users such as a newsfeed and people's "likes," in CEO Marissa Mayer's biggest product revamp since taking the helm of the ailing company last year.
Mayer, who took over in July after a procession of CEOs was shown the door, said in a blog post on Wednesday that Yahoo's redesigned website will let users log in with their Facebook IDs to gain access to content and information shared by friends - from articles and videos to birthdays.
Yahoo is one of the world's most-visited online properties, but revenue has declined in recent years amid competition from Google Inc and Facebook Inc.
The changes to Yahoo's Internet shop window, which include a more streamlined mobile application for smartphones and tablets, will be rolled out over coming days. The makeover follows a new version of Yahoo mail, one of its most popular applications, introduced in December.
Analysts say the move marks a strengthening of Yahoo's ties with Facebook, employing some of the social network's growing data on its billion-plus users to battle Google for Web users' attention. It remains to be seen whether the initial makeover and tweaks expected over time will win back its Internet audience.
"This is definitely an important step. The Yahoo home page is one of the most important things because it is the first interface," said B. Riley Caris analyst Sameet Sinha. "It's familiar in terms of layout, the newsfeed is interesting, and it will be interesting to see how it develops over time.
"The key will be how data is aggregated within Yahoo and Facebook."
TUMULT, TRANSITION
Seven months into her tenure, former Google executive Mayer has arrested the decline of the Internet portal and won favor on Wall Street with stock buybacks among other things. But Yahoo's forecast of a modest revenue uptick this year still pales in comparison with the growth of rivals like Google and Facebook, which are eating into its advertising market share.
"We wanted it to be familiar but also wanted it to embrace some of the modern paradigms of the Web," Mayer said of the product revamp on NBC's "Today" show on Wednesday.
"One thing that I really like is this very personalized newsfeed; it's infinite and you can go on scrolling forever," she said.
Among other problems, Yahoo has been plagued by internal turmoil that has resulted in a revolving door of CEOs. Mayer, 37, took over after a tumultuous period during which former CEO Scott Thompson resigned after less than six months on the job over a controversy about his academic credentials. Yahoo co-founder Jerry Yang then resigned from the board and cut ties with the company.
Thompson's predecessor, the controversial and outspoken Carol Bartz, was fired over the phone for failing to deliver on growth. Yahoo's 2012 revenue was $5 billion. It has been flat year over year, off from some $6.3 billion in 2010.
Yahoo shares were down 0.3 percent at $21.22 at midday on Wednesday on the Nasdaq.

Original Article : http://www.reuters.com/article/2013/02/20/yahoo-website-idUSL1N0BK4UL20130220

Samsung Announces WiFi-Only Galaxy Camera

Last year, Samsung launched its first Android-powered digital camera. The device offered various connectivity options, including 3G as well as 4G LTE in select markets, and shipped with the then-new Android 4.1 (Jelly Bean) operating system. Today, Samsung went ahead and announced a new WiFi-only Galaxy Camera.
As the name suggests, the latest variant of the Galaxy Camera doesn't pack a cellular radio. You will need to find a stable Wi-Fi connection before uploading your photos to Facebook, Instagram and other social networking sites. Apart from that, there are no major changes to the specs of the device.
The Samsung Galaxy Camera features a large 4.8-inch HD SLCD display, a 1.4 GHz quad-core processor, Android 4.1.2 (Jelly Bean), a 16-megapixel BSI CMOS sensor, a 21x zoom lens, Full HD (1080p) video recording and playback, GPS with A-GPS, HDMI, 8 GB of internal memory, a microSD card slot with up to 32 GB of expandable memory, Google Play Store, Samsung Hub, Wi-Fi 802.11 b/g/n and a 1,650 mAh battery.
Samsung has not yet announced the price of this device. However, it is expected to cost much less than the previously launched Galaxy Camera with 3G/4G connectivity. The camera will go on sale in the coming weeks.


Original Source : http://techie-buzz.com/gadgets-news/samsung-galaxy-camera-wifi-price-specs.html

HTC One Unveiled – Snapdragon 600 Processor, Sense 5, BoomSound And UltraPixel Camera

After quite a few leaks in the last few hours, HTC has finally officially announced its latest flagship – the One. The One is a typical HTC handset with a unibody aluminium fit and finish. The handset sports a 4.7-inch Super LCD 3 display with a 1080p (1920x1080) resolution and a pixel density of 468ppi. The front of the handset is protected by Gorilla Glass 2, interrupted only by the dual stereo speakers, which come with their own dedicated amps for BoomSound.

Inside, the One is among the first Android smartphones to make use of Qualcomm’s Snapdragon 600 SoC clocked at 1.7GHz and an Adreno 320 GPU. There is also 2GB of RAM, and 32 or 64GB of on-board memory. The usual bunch of connectivity features and sensors are also present, except for a microSD card slot.
On the software side, HTC has blessed the One with Android 4.1.2 Jelly Bean and a brand new version of Sense. In its fifth version, the Sense homescreen has BlinkFeed, which aggregates all your social networking updates and notifications right on your homescreen. The homescreen design has clearly been inspired by Flipboard.
A few unique features of the One include a 2.1MP front-facing camera with an 88-degree wide-angle view, first seen on the 8X; an IR blaster integrated into the power button; and a brand new UltraPixel camera at the back. The One 'only' comes with a 4MP f/2.0 camera, but the sensor is capable of capturing 300 percent more light, allowing for some stunning pictures. It can also record 1080p video at 60FPS, and features HTC's ImageChip 2 along with optical image stabilisation. HTC has also simplified the UI by keeping only the back and home buttons and removing the Recent Apps button entirely.
The HTC One will be available in the second half of March in silver or black across 185 carriers all over the world including T-Mobile, Sprint and AT&T in the United States.

Why Business Intelligence Software Is Failing Business

Business intelligence software is supposed to help businesses access and analyze data and communicate analytics and metrics. I have witnessed improvements to BI software over the years, from mobile and collaboration to interactive discovery and visualization, and our Value Index for Business Intelligence finds a mature set of technology vendors and products. But even as these products mature in capabilities, the majority lack features that would make them easy to use. Our recent research on next-generation business intelligence found that usability is the most important evaluation criterion for BI technology, outpacing functionality (49%) and even manageability (47%). The pathetic state of dashboards and the stupidity of KPIs illustrate some of the obvious ways the software needs to improve for businesses to gain the most value from it. We need smarter business intelligence, and that means not just more advanced capabilities designed for analysts, but software designed for those who need to use BI information.
Our research finds the need to collaborate and share (67%) and inform and deliver (61%) are in the top five evaluation categories for software. A few communication improvements, highlighted below, would help organizations better utilize analytics and BI information.
Personalized Notifications and Alerts
Everyone in business is busy. Managers and directors have little time to do analysis. Yet BI advancements in visual and data discovery are focused on analysts, and not aimed at the majority of those in business who need to be notified of issues critical to business processes for which they are responsible. We need to make it simpler to consume BI from any point of a presentation. Individuals should be able to review lists of critical metrics as easily as they can browse a directory of files, and based on access rights should be able to select and scope them to suit their areas of responsibility (geography/location, customer, products) and the time period needed for analysis. Users should be able to compare ranges across time and set thresholds that trigger a notification or alert delivered into email or directly to a mobile platform (smartphone or tablet).
This notion of personalized notifications is pretty simple, and our recent research on next-generation business intelligence found that alerts and notifications are the most important capabilities for mobile technology according to 42 percent of organizations. Business Intelligence software vendors must not just focus on providing charts to mobile technology but rather on providing better access to information, and should offer notifications of changes to the analytics and metrics that matter. Personalized notifications should be a self-service activity that does not require IT or analysts to get involved; the software should be usable and smart enough to perform this basic business function.
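The self-service alerting described above amounts to comparing each subscribed metric against a per-user threshold and emitting a notification when it is crossed. A minimal sketch, with hypothetical metric names standing in for whatever KPIs a manager scopes to their area of responsibility:

```python
def alerts_for(user_thresholds, latest_metrics):
    """Self-service alerting: compare each metric a user
    subscribed to against that user's own threshold and
    return the notifications to deliver (e.g. by email or
    push to a smartphone or tablet)."""
    notifications = []
    for metric, threshold in user_thresholds.items():
        value = latest_metrics.get(metric)
        if value is not None and value < threshold:
            notifications.append(
                f"{metric} fell to {value} (threshold {threshold})")
    return notifications

# A regional manager watching two KPIs:
thresholds = {"weekly_revenue": 100_000, "on_time_delivery_pct": 95}
metrics = {"weekly_revenue": 87_500, "on_time_delivery_pct": 97}
print(alerts_for(thresholds, metrics))
```

The point is how little machinery this requires: the thresholds are data the user sets, so no IT or analyst involvement is needed once the metrics exist.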
Text Presentation of Analytics
BI has focused on presenting tables of data and charts to visualize data. More advanced deployments blend in maps and location analytics, a feature whose value I have already espoused. But even with dashboards, individuals can find it challenging to look at four or more charts with no context or communication about them. Even as we make the presentation of charts through visualization fancier, the problem is that the majority of business professionals are not trained to interpret charts and would rather read what is going on in their business, just as they read the newspaper or its digital form on their tablet. We can read summary paragraphs about the news; business users should be able to see similar communication about the analytics that matter to their role, yet today's business intelligence software lacks any way of presenting data in readable text or natural language. This needs to be part of business intelligence, and organizations need to voice their needs and ensure that software companies understand how this would help them use analytics.
Make Observations on Analytics
Analysts are responsible for conducting analysis, providing observations found in the analytics and communicating them to others in business. Despite the variety of business intelligence software available, the majority of analysts today place charts and graphs into a presentation using technology like Microsoft PowerPoint. Once a chart is placed into a presentation, analysts place observations as bullet points next to the chart, title the slide, and add summary points or actions in a text box at the bottom. They then repeat the process to create a collection of slides that are usually exported to Adobe Acrobat and emailed to others.
Now step back and ask yourself whether your BI software supports those kinds of operations today. Does your dashboard of charts let you place text around a chart for communication? Can the text from an analyst be fixed so it does not change, yet still be commented on or discussed in a forum? From my analysis of the majority of software providers, the answer is no. It is not clear why vendors fail to support this basic process of analysis, observation and notation on analytics and metrics. You should be able to add a free-form text box to the left or right of a chart, add observations that are fixed, and place the result into a dashboard.
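The missing capability described here, a chart paired with a fixed analyst observation that others can comment on but not edit, is simple to model. A hypothetical sketch of such a dashboard tile:

```python
from dataclasses import dataclass, field

@dataclass
class AnnotatedChart:
    """A dashboard tile pairing a chart with the analyst's
    observation; the observation itself is fixed once
    published, while others append comments to a thread."""
    title: str
    observation: str          # fixed analyst text
    comments: list = field(default_factory=list)

    def comment(self, author, text):
        """Append a (author, text) entry to the discussion thread."""
        self.comments.append((author, text))

tile = AnnotatedChart(
    title="Q4 churn by region",
    observation="Churn spike in EMEA driven by contract expiries.")
tile.comment("ops-lead", "Renewal campaign started 12 Jan.")
print(tile.observation)
print(len(tile.comments))
```

Nothing here is exotic; that is precisely the author's complaint, since analysts currently rebuild this pairing by hand in PowerPoint slide after slide.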
These areas are just some examples of ways to improve business intelligence. I hope that BI software providers start to add more communication and collaboration capabilities that adapt to the way people work, rather than the current approach that forces people to spend more time in their products. As a customer, you should voice the needs you have today. For years, BI software providers have claimed there was no demand for capabilities like collaboration and mobility to make things easier for business users, but my analysis indicates this was because they were getting feedback only from IT organizations and IT industry analysts who do not research or understand how business professionals operate and how they want to become smarter in how they communicate and collaborate, including via mobile technology. Our benchmark research finds collaborative capabilities have been important for some time; our latest research in 2012 finds them very important to 26 percent of organizations and important to another 41 percent. In addition, more than a third (38%) would like collaborative support as part of their business intelligence product, and almost a quarter (24%) have no preference in their approach. If vendors do not add this capability, businesses will continue to use Microsoft Office (36%) or find a stand-alone collaboration tool (17%) to meet the need. The need for collaboration is very clear, as it is the second-ranked priority (16%) in our technology innovation benchmark research, after analytics (39%).
Your BI software should support all business roles, not just analysts, though analysts do need smarter tools such as visual discovery. IT departments should examine why business still uses spreadsheets and presentation software instead of BI software. To get a shared enterprise approach to business intelligence, business must have software that supports the analytic and decision processes we have been successful with for decades, as well as what people need today. By addressing these needs we will also be better prepared for investments in big data, which our research finds are expected to improve communications and knowledge sharing. Our analysis finds a need for more social and collaborative BI features to help teams support the business; this is a key component of my colleagues' business analytics research agenda. Assess what you are doing to optimize business intelligence software for business use and to provide better communication and easier consumption of analytics and metrics. If you cannot get what you need, maybe it is time to switch to a BI software provider that does deliver what you need for business.

Original article