How to Align Your New Solution with Business Needs

It’s wonderful that the BI toolkit is constantly expanding with new and better tools. Increased computing power and enhanced software enable companies to handle unprecedented volumes of data and to provide access to information to practically anyone who needs or wants it. However, the wide variety of solution components to choose from makes selecting the right tool all the more complex.
Additionally, with more users needing access, accommodating different types of users introduces further challenges when designing the solution. In short, more choices and more users often result in a solution design that provides less – as in less-than-satisfactory – alignment with business needs and goals. But by systematically assessing the key drivers of solution design, you can determine the solution that will truly align with your business needs, in spite of more choices and more users.
Let’s start by taking a look at the key solution components. Keep in mind that BI solutions are built to integrate data from multiple sources and make that integrated data set available for analysis. Based on this, our BI solution has three main components: data extraction and movement (usually referred to as ETL), data storage and integration (i.e., database) and reporting and analysis. Technology will need to be selected to handle each of these areas and to integrate with each of the other components as well. To select the most appropriate technologies for your environment, you will need to look at each of the key drivers and determine which technologies best address those drivers.
There are several other components that may be needed for your solution: metadata management, security, data quality and scheduling, to name a few. While this article focuses on the three main components, a similar approach should be applied for any additional areas of your solution.
So what are the drivers that will help us align our solution to our business needs? The categories I use are:
  • Requirements
  • Data
  • Developer skills
  • End user skills
Another way to think about these drivers is by considering the following questions: What does the solution need to do? What raw material (data) do I have to work with? What experience does the development team have in this area? How will people use the solution?
By combining an assessment of each of these areas with knowledge of the available technology options, the best choice can be made for each component of the solution. So let’s look at some of the most common facets of each of these components and consider how they play into your technology selection. By doing this, you will be well on your way to aligning your solution with the business needs.

Requirements

The most important thing to know about your proposed solution is what it is intended to do. Does it need to integrate hundreds of sources of data? Will it be receiving real-time feeds that need to be analyzed, formatted and sent in near real time to end users? Does it need to handle petabytes of unstructured data? Will the solution be running numerous reports each week that will be emailed to a set of end users? How many end users will it support: hundreds, or just a few?
These questions will help you narrow down your choices to a manageable number. You should start by answering some of the bigger architectural questions, such as determining whether you need a traditional relational database solution or a data warehouse appliance solution. Do you want to have your hardware in house or in the cloud? Will your solution require a small data mart or big data architecture? Huge amounts of data, such as sensor data, could lead you to a big data solution. Producing a handful of accounting reports is typically a small data mart solution.
A BI architect will know how to read your requirements and help steer the solution in the right direction. Some of these choices will also feed into your technology selections as well, so be sure to include them in your selection process.
The list of potential questions for BI solutions is nearly endless. Rather than trying to nail down every possible requirement under the sun, gather a sufficient set of requirements and learn how to categorize and prioritize them so you can design an achievable solution.
You will need to identify which requirements will impact which of the main components of your solution: ETL, database and reporting/analytics. If you can build that list, you should then be able to match your requirements to the capabilities of tools in each of those categories. Once a short list of the tools is made, a more rigorous selection process can be launched that considers factors like price and user friendliness.
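One way to make the requirements-to-capabilities matching concrete is a simple weighted scoring matrix. The sketch below is illustrative only; the requirements, weights, tool names and coverage scores are all hypothetical placeholders, not recommendations:

```python
# A minimal sketch of a weighted scoring matrix for shortlisting tools.
# Requirements, weights, tool names and coverage values are hypothetical.

requirements = {                     # requirement -> business importance (weight)
    "real-time feeds": 5,
    "petabyte-scale storage": 3,
    "scheduled report delivery": 4,
}

# How well each candidate covers each requirement (0 = not at all, 3 = fully).
tools = {
    "Tool A": {"real-time feeds": 3, "petabyte-scale storage": 1,
               "scheduled report delivery": 2},
    "Tool B": {"real-time feeds": 1, "petabyte-scale storage": 3,
               "scheduled report delivery": 2},
}

def score(coverage):
    """Weighted sum of a tool's coverage across all requirements."""
    return sum(weight * coverage.get(req, 0)
               for req, weight in requirements.items())

shortlist = sorted(tools, key=lambda name: score(tools[name]), reverse=True)
for name in shortlist:
    print(f"{name}: {score(tools[name])}")
```

The ranked shortlist then feeds the more rigorous selection process (price, user friendliness and so on) described above.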

Data

Data is the raw material that you will be using in your solution. The data will be brought into your solution (likely via ETL processes), conformed, integrated, aggregated, summarized, organized (in the data warehouse and marts) and made available for reporting and analysis (using reporting/analytics/data visualization tools).
The data must be assessed and understood in order to design an appropriate solution. The data will determine if your solution will fall into the big data category. Will there be enough volume, variety and velocity to qualify? Will the data be coming in real time or in periodic batches? Will data integration be complex or straightforward? Will the users understand the data or will it be too complex for them? The answers to these questions will help you determine the best tools to manage your data.
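Assuming you have even a small sample of incoming events, the volume, variety and velocity questions can be answered with a rough profile. This sketch uses invented sample data purely to illustrate the idea:

```python
# A rough sketch of profiling a data sample for the "three Vs" questions above.
# The sample events and source names are invented for illustration.
from datetime import datetime

# Hypothetical sample of incoming events: (timestamp, source_system)
sample = [
    ("2013-01-25T10:00:00", "pos"),
    ("2013-01-25T10:00:01", "web"),
    ("2013-01-25T10:00:01", "pos"),
    ("2013-01-25T10:00:04", "sensor"),
]

timestamps = [datetime.fromisoformat(ts) for ts, _ in sample]
span_seconds = (max(timestamps) - min(timestamps)).total_seconds() or 1.0

volume = len(sample)                              # how much data
variety = len({source for _, source in sample})   # how many distinct sources
velocity = volume / span_seconds                  # arrival rate, events/second

print(f"volume={volume} records, variety={variety} sources, "
      f"velocity={velocity:.2f} events/sec")
```

Profiling a real feed this way, before any tool selection, gives concrete numbers to compare against the scale each candidate technology is designed for.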
At a higher level, an overall data strategy also needs to be understood and addressed. The premise for the data strategy is that the source systems contain or create the data needed for analysis, and can move that data effectively to the data warehouse. A data strategy is especially important when implementing a major new software solution, especially an ERP system.
As with most solutions, start with the end in mind. Do your best to anticipate what analysis will need to be done after the new solution is implemented. Figure out what data will be needed to do that analysis. Then be sure the new solution can capture or create that data at the right level of granularity. After the new software has been deployed, you really don’t want to hear, “If you had told us you needed data at the daily level when we were installing the software we could’ve given it to you, but the system only captures data at the weekly level because of the way we configured it.”

Developer Skills

One thing that often gets overlooked when selecting tools is the skill set of the existing development team. If you already have tools in house that your developers are familiar with, start by assuming they will be your tools of choice unless you can find a good reason to use something else. If a new tool is needed, people will need to be trained and, more importantly, gain experience with the new technology. You should also consider the degree of difficulty in training employees or finding new employees who possess the desired skill set. You could paint yourself into a corner by selecting a really nifty, inexpensive tool and then not being able to find anyone with the expertise to implement or support it.

End User Skills

Remember that the best measure of success of a BI solution is the extent to which it is adopted by the end users. With that in mind, what is the profile of your end users? Typically, end users fall into three categories: those with little technical skill who only want to see completed reports; those who are capable of some manipulation and prefer parameter-driven reports; and those who are technically savvy, are intimate with the data and prefer to build their own reports and analysis from scratch. Find out what your user community looks like and select tools they can use. Most of today’s market-leading tools can accommodate all three types of users; it is simply up to you to build out the solution in a way that serves your particular community.

Conclusion

Because BI solutions are typically built over a long period of time with the primary goal of supporting business users, it is vital that the solution owner works closely and continuously with the business community to deliver a solution that aligns with both the business needs and the business capabilities. You want the solution to be used, so design a solution that makes it as easy as possible for your team to build it and the business community to use it.  That way, they’ll be sure to come back for more.

Source: http://www.information-management.com/news/how-to-align-your-new-solution-with-business-needs-10023855-1.html?zkPrintable=1&nopagination=1

Apple’s 2013 Supplier Responsibility Report Includes 72% Bump In Audits For 2012, 97% Increase In Training

Apple has released its 2013 Supplier Responsibility Progress Report, and it features a number of updates from last year, including Apple’s decision to join the Fair Labor Association (a notable first), and conduct audits of its suppliers in tandem with that outside watchdog organization. The results seem to be a tightening of Apple’s code of conduct for suppliers all around, in terms of monitoring, penalties and programs to improve conditions.
Apple conducted 72 percent more audits in 2012 than it did in 2011, totaling 393 audits across facilities employing 1.5 million workers. All types of audits increased for the year, including first-time audits, repeat audits, process safety assessments and specialized environmental audits, with the last category taking the biggest jump versus previous years. In 2012, Apple conducted 55 focused environmental audits, a 293 percent increase over the number it ran in 2011. The Mac maker works with outside organizations in this area, too, just as it does with the FLA regarding labor, including the Natural Resources Defense Council, the EPA and the Institute of Public and Environmental Affairs.
The supplier audits also actually resulted in more severe punitive action than usual. Apple has faced criticism in the past for doling out corrective measures that seem rather toothless – most often putting suppliers “on probation,” meaning they’ll be watched more closely for future violations. But one supplier fell afoul of Apple’s measures to protect against underage labor, with 74 cases counted at a single facility. Apple terminated the relationship with that offending party entirely, proving that there are real consequences for companies that ignore its code of conduct and local labor laws.
Apple also came down harder on companies for compliance with working hour regulations, and changed its policies and practices in monitoring them to be more effective. In 2012, Apple started doing real-time work hour tracking on a weekly basis for over 1 million of the employees at its supplier companies, and publishing data on its progress every month. That led to a 92 percent compliance rate with its 60 hour maximum work week, as laid out in the Apple Supplier Code of Conduct, and Apple says overall work weeks averaged less than 50 hours.
Another area of improvement for Apple was in participation in its training and education programs. There were 1.32 million workers trained on local laws, worker rights, health and safety and Apple’s own Code of Conduct during 2012, a 97 percent increase over 2011’s 670,000. Apple also provided more free educational opportunities to workers than ever before, with 201,000 cumulative participants in those programs, up 235 percent from 60,000 in 2011.
Apple’s transparency definitely improved over the course of 2012 when it comes to its efforts around supplier responsibility and maintaining healthy and safe work environments, and that’s something Apple CEO Tim Cook clearly undertook as a conscious effort. That’s not to say that Apple didn’t have its fair share of labor issues during the year (issues around the demanding requirements for building the iPhone 5 come to mind), but especially in the way that Apple has allowed disinterested third parties to come in and aid with its monitoring efforts, 2012 was definitely the most significant year yet in terms of improvements made to its stance on supplier responsibility.

Global Smartphone Shipments Reach a Record 700 Million Units in 2012

Boston, MA - January 24, 2013 -- According to the latest research from Strategy Analytics, global smartphone shipments grew 43 percent annually to reach a record 700 million units in 2012. Samsung was the star performer, capturing 30 percent marketshare worldwide and extending its lead over Apple and Nokia.
Neil Shah, Senior Analyst at Strategy Analytics, said, “Global smartphone shipments grew 38 percent annually from 157.0 million units in Q4 2011 to 217.0 million in Q4 2012. Global smartphone shipments for the full year reached a record 700.1 million units in 2012, increasing robustly from 490.5 million units in 2011. Global shipment growth slowed from 64 percent in 2011 to 43 percent in 2012 as penetration of smartphones began to mature in developed regions such as North America and Western Europe.”
Neil Mawston, Executive Director at Strategy Analytics, added, “Samsung shipped a record 213.0 million smartphones worldwide and captured 30 percent marketshare in 2012. This was the largest number of units ever shipped by a smartphone vendor in a single year, beating Nokia’s previous all-time record when it shipped 100.1 million units during 2010. Despite tough competition in stores and courtrooms, Samsung continued to deliver numerous hit models, from the high-end Galaxy Note2 phablet to the mass-market Galaxy Y. Apple grew a healthy 46 percent annually and shipped 135.8 million smartphones worldwide for 19 percent marketshare in 2012, broadly flat from the 19 percent level recorded in 2011. Apple had a strong year in developed regions like North America, but this was offset partly by its limited presence in high-growth emerging markets such as Africa.”
Linda Sui, Analyst at Strategy Analytics, added, “Samsung and Apple together accounted for half of all smartphones shipped worldwide in 2012. Large marketing budgets, extensive distribution channels and attractive product portfolios have enabled Samsung and Apple to tighten their grip on the smartphone industry. The growth of Samsung and Apple has continued to impact Nokia. Nokia retained its position as the world’s third largest smartphone vendor for full-year 2012, but its global marketshare has dropped sharply from 16 percent to five percent during the past year. Nokia’s Windows Phone portfolio has improved significantly in recent months, with new models like the Lumia 920, but we believe the vendor still lacks a true hero model in its range that can be considered an Apple iPhone or Samsung S3 killer.”
Exhibit 1: Global Smartphone Vendor Shipments and Market Share in Q4 2012.

Global Smartphone Vendor Shipments (Millions of Units)

            Q4 '11     2011   Q4 '12     2012
Samsung       36.5     97.4     63.0    213.0
Apple         37.0     93.0     47.8    135.8
Nokia         19.6     77.3      6.6     35.0
Others        63.9    222.8     99.6    316.3
Total        157.0    490.5    217.0    700.1

Global Smartphone Vendor Marketshare %

            Q4 '11     2011   Q4 '12     2012
Samsung      23.2%    19.9%    29.0%    30.4%
Apple        23.6%    19.0%    22.0%    19.4%
Nokia        12.5%    15.8%     3.0%     5.0%
Others       40.7%    45.4%    45.9%    45.2%
Total       100.0%   100.0%   100.0%   100.0%

Total Growth Year-over-Year %

            Q4 '11     2011   Q4 '12     2012
Growth       55.9%    63.8%    38.2%    42.7%
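The year-over-year growth figures in the exhibit can be verified with a quick calculation against the shipment totals:

```python
# Quick arithmetic check of the year-over-year growth row in the exhibit above,
# using the total shipment figures (millions of units) it reports.
def yoy_growth(prev, curr):
    """Percentage growth from prev to curr."""
    return (curr - prev) / prev * 100.0

q4_growth = yoy_growth(157.0, 217.0)    # total shipments, Q4 2011 -> Q4 2012
year_growth = yoy_growth(490.5, 700.1)  # total shipments, full-year 2011 -> 2012
print(f"Q4: {q4_growth:.1f}%, full year: {year_growth:.1f}%")  # 38.2%, 42.7%
```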

Original Source: http://blogs.strategyanalytics.com/WSS/post/2013/01/25/Global-Smartphone-Shipments-Reach-a-Record-700-Million-Units-in-2012.aspx

YC-Backed Segment.io Lets Developers Integrate With Multiple Analytics Providers In Hours, Not Weeks

Segment.io, a Y Combinator-backed analytics startup for developers, offers an easier way for developers to integrate the APIs from multiple analytics providers into their own applications. The service currently supports 20 analytics providers, including those from Google, KISSmetrics, Mixpanel, Chartbeat and more, as well as enterprise providers like HubSpot and Salesforce.
Currently, both client-side and server-side analytics are supported, and the company is planning to release a mobile solution in the future.
The company’s founders, Peter Reinhardt, Ilya Volodarsky, Ian Storm Taylor, and Calvin French-Owen, three of whom started off as roommates while at MIT (Ian was at the Rhode Island School of Design), all dropped out to participate in Y Combinator’s summer 2011 program. Originally, the team was focused on building a competitor to Google Analytics or KISSmetrics, but they had trouble getting people to integrate with their service. However, the team had also built a library called Analytics.js that wrapped all the analytics services and APIs together, and released an open source version on GitHub.
“The open source version started growing by itself,” says Reinhardt, “so after a couple of months, we decided to pivot and build what people seemed to actually want: a beautiful, simple analytics API.” They made that shift in December. He says that around 1,800 developers have started with the open source version, which the company has no plans to take down. Since December, the team has been building a premium version, which today has thousands of signups and over 300 active projects using the service.
The problem, Reinhardt explains, is that each API out there is slightly different, so businesses will want to use them for different things – one for tracking referrals, another for custom event tracking, a third for targeted emails, etc., etc. “A developer will have to sit down and figure out all these APIs, which is kind of a nightmare because they don’t feel like they’re working on the product,” he says.
Instead, with Segment.io, the developer can use a single, simple API that works across all providers. It also makes adding and removing APIs easy, saving significant development time; Reinhardt says integrations might take only a couple of hours. For companies that have previously had to integrate with the APIs from providers like Salesforce or Marketo, the time saving is even more impressive. “They have these really nasty, old APIs that are based on SOAP XML instead of REST APIs,” Reinhardt says. Some of Segment.io’s customers spent months integrating with those providers in the past, and now it takes them hours, he tells us.
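The general pattern described here, one tracking call fanned out to every configured provider behind a common interface, can be sketched in a few lines. This is an illustrative Python sketch of the idea only; the class and method names are invented and are not Segment.io’s actual library or API:

```python
# Sketch of the fan-out pattern: one track() call dispatched to every
# configured analytics provider. Names here are hypothetical, not Segment.io's.

class Provider:
    """Common interface each analytics integration implements."""
    def send(self, user_id, event, properties):
        raise NotImplementedError

class RecordingProvider(Provider):
    """Stand-in for a real integration (Mixpanel, KISSmetrics, ...)."""
    def __init__(self, name):
        self.name = name
        self.events = []
    def send(self, user_id, event, properties):
        self.events.append((user_id, event, properties))

class Analytics:
    """The single API the developer codes against."""
    def __init__(self, providers):
        self.providers = providers
    def track(self, user_id, event, properties=None):
        # One call here replaces a separate hand-rolled integration per provider.
        for provider in self.providers:
            provider.send(user_id, event, properties or {})

mixpanel = RecordingProvider("mixpanel")
kissmetrics = RecordingProvider("kissmetrics")
analytics = Analytics([mixpanel, kissmetrics])
analytics.track("user-42", "Signed Up", {"plan": "startup"})
```

Adding or removing a provider then means editing the list passed to `Analytics`, rather than touching every `track` call in the application.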
The company previously only offered a browser JavaScript library, but this Tuesday, it launched libraries for Ruby, Node.js, Python, Java and .NET. They’re now focused on adding support for PHP, which is users’ top request. Plus, they’ll be rolling out support for more analytics providers going forward at a rate of two to three per week. The next one out of the gate is Pardot, which will arrive next week.
Reinhardt says the client-side, hosted version (basically, Analytics.js) will always be free, but developers will pay to use the server-side libraries. The “startup” plan is $30 per month for 1 million server-side API calls, and the premium tier is $150 per month for 10 million server-side calls, and access to premium integrations including HubSpot, Marketo, Omniture, and Salesforce. Both paid versions include email support as well.
While at the end of the day, Segment.io is about saving development time, Reinhardt explains that the impacts it will have on the business as a whole are even broader. “It’s an investment [from the company] in being data-driven,” he says. If they use the simplest possible API and they make it really easy to add other analytics services, then it’s really easy to make changes. “It removes the barrier to using the analytics solutions they should be using, but aren’t, because it’s a pain to set up,” he says.
Developers can sign up for Analytics.js here.

Source :
http://techcrunch.com/2013/01/25/yc-backed-segment-io-lets-developers-integrate-with-multiple-analytics-providers-in-hours-not-weeks/

Social Media and Business Intelligence: Creating the Integrated Customer Hub

The following is an excerpt from “Social Media and Business Intelligence: Creating the Integrated Customer Hub,” an exclusive whitepaper brought to you by Oracle. Download the complete whitepaper now and learn how, by creating a 360-degree view of your customers, you can equip organizational teams with the intelligence they need to successfully engage with them.
Introduction
In the beginning, individuals monitored social media conversations via RSS feeds or by manually reviewing social media platforms for mentions of their companies, brands, or offerings. Unfortunately, this approach was not scalable, and over time the adoption of social media tools and platforms has outstripped companies’ ability to derive real-time consumer insights and to effectively engage with social customers in a relevant and timely manner.
As social media use has grown, an urgent need has emerged to correlate the information generated through social data with existing consumer information, and integrate it with sophisticated data management systems. No longer is social the sole purview of the marketing or PR group within an organization. Today, the insights derived from social media are as relevant to customer service as they are to engineering. By creating a 360-degree view of your customers, you can equip organizational teams with the intelligence they need to successfully engage with them. By optimizing your social strategy to leverage both social insights and existing private data, you enable your organization to create outreach efforts, new products, and campaigns grounded in real-time, repeatable, automated, and scalable analysis.
A New Age for Content
Never before have so many individuals been involved in the production, communication, and sharing of content—making this a disruptive time for traditional industries (like advertising, news, and entertainment), which find themselves vying with consumer-created content for customers’ attention.
To compete effectively in this new landscape, organizations must monitor and analyze the conversations taking place over social media. This is where their customers are. And this is where they need to be—either participating in or monitoring those conversations. Companies that fail to do so are missing out on consumer insights and opportunities to heighten brand awareness.
With social media now a key component of most organizations’ business and outreach strategies—and the volume of social data steadily rising—rudimentary analytics technology and manual reviews of social media platforms no longer suffice. Organizations need a new approach to monitoring the conversations taking place over social media, and Oracle Social Engagement and Monitoring Cloud Service provides it.
A Superior Approach
The semantic search and analytics technology at the heart of Oracle Social Engagement and Monitoring Cloud Service enables it to automatically capture consumer “considerations and preference” metrics as well as insights from consumer-generated content in social media and in structured and unstructured data environments. Some solutions use Boolean or keyword searches to analyze information but are unable to disambiguate terms whose meaning depends on context (a term that could refer to a brand of shoes or to a reptile, for example); others rely on natural-language processing (NLP), a time-consuming and complex language-modeling approach, to disambiguate content. Oracle Social Engagement and Monitoring Cloud Service uses advanced statistical language modeling to address both the inaccuracy and bluntness of keyword search and the speed and cost disadvantages of NLP techniques.
 
Figure 1. Oracle Social Engagement and Monitoring Cloud Service uses advanced statistical language modeling.
The semantic engine in Oracle Social Engagement and Monitoring Cloud Service is based on latent semantic analysis (LSA), which allows meaning to be derived from social media conversations and private data. By using sophisticated language-modeling technology, Oracle Social Engagement and Monitoring Cloud Service is able to achieve a high degree of accuracy, uncovering a consumer’s true considerations and preferences as they relate to lifestyle, category, brand, product, and campaign.
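As a rough illustration of the LSA technique named above (not Oracle’s implementation), one can factor a term-document matrix with SVD and compare documents in the reduced “concept” space. The tiny corpus below is invented purely for demonstration:

```python
# Minimal illustration of latent semantic analysis (LSA): factor a
# term-document count matrix with SVD and compare documents in the reduced
# concept space. The three-document corpus is invented for demonstration.
import numpy as np

docs = [
    "love my new phone camera",
    "phone camera takes great photos",
    "the service at the store was slow",
]
vocab = sorted({w for d in docs for w in d.split()})
# Term-document count matrix: rows = terms, columns = documents.
A = np.array([[d.split().count(w) for d in docs] for w in vocab], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                                     # keep the top-k latent "concepts"
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T    # each row: a document in concept space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Documents 0 and 1 share vocabulary ("phone camera"), so they land close
# together in concept space; document 2 shares nothing and lands apart.
print("docs 0 vs 1:", cosine(doc_vecs[0], doc_vecs[1]))
print("docs 0 vs 2:", cosine(doc_vecs[0], doc_vecs[2]))
```

Production systems build on the same core idea at far larger scale, with weighting (e.g. TF-IDF) and much larger vocabularies, to surface which conversations concern which brands, products and preferences.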

Article Source :
http://smartdatacollective.com/99301/social-media-and-business-intelligence-creating-integrated-customer-hub