MPS Partners provides functional and technical expertise and insights into business process management trends and Microsoft technologies.

The Agile BI Platform

When you hear the words Agile and Business Intelligence (BI) in the same sentence, what comes to mind? Is it simply the application of agile methodologies and concepts to your BI projects, or is it something more? Agile is defined as "able to move quickly and easily" and is characterized by being "quick, resourceful and adaptable." So what does this mean with respect to BI? It means being able to respond quickly to constantly changing information and analytic requirements. It means being able to visualize data so that insights can be quickly discerned and the speed and quality of decisions based on that data are increased. And it means having the tools and platforms, in addition to the appropriate practices, that enable that agility.

Background

In order to understand the impact of agile BI on an organization, it is necessary to establish a foundation of how current BI solutions are developed and deployed and how traditional BI toolsets interact with these solutions. The following diagram depicts the classic approach to developing and deploying BI solutions from a data perspective:

The primary issue with this approach is that there is typically a long cycle time between when the requirements are defined and data sources are identified and when a usable deliverable is produced. This data process is often overlaid with a classic waterfall approach.

The above processes may be performed for each data mart, following a bottom-up approach to define the data sources, or at the data source level to define the data marts. In either case there is a great deal of inertia in the solution development and deployment processes. That is not to say that these approaches should be completely abandoned for enterprise-wide BI solutions, but they should be augmented to increase their efficiency. Another concern is that, for a BI point solution, much of the required data may not be in enterprise systems at all. Referring to my previous Agile BI Overview blog entry:

76% of all data needed in a reporting solution already exists in standalone databases or spreadsheets

This is significant in that the quantity of data available to content producers and information consumers may already be sufficient to meet their analytic and reporting requirements. At the same time, users are placing more demands on IT departments with shrinking resources: they are asking for self-service, reduced cycle times for BI solution deployments, and advanced data visualizations to enable faster tactical and strategic decision making.

With regard to agile BI, one of the core enablers is the BI platform itself. With the choice of a proper BI platform and an appropriate approach to BI solution development, the foundation for an agile BI environment can be realized.

The Agile BI Platform

Now that we have an understanding of the current state, how do we proceed with agile BI, and what is the impact of the BI platform? With the need for rapid results and adaptability, the focus should be on delivering incremental value within a short duration. Candidate solutions should be narrowly focused and evaluated on ease of implementation versus overall value.

 

Delivering incremental value within a short duration keeps the business involved, as results are readily visible. It also reduces risk and cost, because issues tend to surface earlier due to the smaller units of work.

Considering that the content creator/information consumer has direct access to a large quantity of data, the BI platform should be able to connect easily to multiple heterogeneous data sources simultaneously. This allows for rapid prototyping of potential solutions. During this time, data visualizations can be produced to provide additional insights and facilitate data exploration. The types of questions asked about the data can serve to define an overall approach for defining an enterprise architecture at a later date. This also means that there will be much deeper collaboration between the content creator, the information consumer and IT (co-location). The above process can be visualized as follows and represents the "Definition, Discovery and Evaluation" phases. Also note that at any time a previous phase may be revisited as required.

What has not yet been discussed is the requirement for traditional ETL, data marts and the data warehouse. ETL may or may not be required, but it should not be overlooked. During the data discovery process, any requirements for data transformation and any data quality issues will be uncovered. Minor issues can be addressed by the BI platform itself, as most modern platforms provide basic data transformation capabilities that can be leveraged during the data integration and consolidation phase. More significant issues may require a combined effort by IT and the business to remediate. The key point is that information consumers are still receiving value and are able to move forward with their analysis. In the traditional approach this typically requires a longer cycle time, as there may be a need to re-execute the entire BI lifecycle process. As for the data marts and the data warehouse, remember that those more formal data structures can be constructed as required. They can be derived and inferred from the incremental units of work that have been developed and deployed using the agile approach and the deployed BI platform.
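
To make the kind of lightweight integration and transformation described above a little more concrete, here is a minimal sketch in Python with pandas that combines a departmental spreadsheet with a table from an enterprise database and applies a small data-quality fix along the way. The file, table and column names are hypothetical; an actual agile BI platform would surface similar capabilities through its own tooling.

    import pandas as pd
    import sqlite3  # stand-in for any enterprise data source

    # A standalone spreadsheet maintained by the business (hypothetical file).
    budget = pd.read_excel("regional_budgets.xlsx")      # columns: Region, Budget

    # Actuals pulled from an enterprise system (hypothetical table).
    conn = sqlite3.connect("enterprise.db")
    actuals = pd.read_sql(
        "SELECT region AS Region, SUM(amount) AS Actual FROM sales GROUP BY region",
        conn,
    )

    # Minor data-quality issue uncovered during discovery: region codes are
    # inconsistently cased across the two sources.
    budget["Region"] = budget["Region"].str.strip().str.upper()
    actuals["Region"] = actuals["Region"].str.strip().str.upper()

    # Consolidate the heterogeneous sources and derive a quick insight.
    combined = budget.merge(actuals, on="Region", how="left")
    combined["Variance"] = combined["Actual"] - combined["Budget"]
    print(combined.sort_values("Variance"))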

Deployment

Once the "Definition, Discovery and Evaluation" phases are complete, it is time for deployment. The chosen BI platform should facilitate fine-grained deployment to audiences of any size and support the social interaction of the audience as the deployed BI artifacts (reports, visualizations, etc.) are reviewed and discussed and the verification process commences. This further enhances and focuses collaboration, which can lead to better decisions being made at a much faster pace.

The most important step in the above process is closing the loop between "Verification" and "Definition" (re-definition). It is imperative that this be performed, as it is how adaptability to constantly changing information and analytic requirements is achieved.

Summary

We briefly discussed some of the tenets of the agile methodology and how the BI platform can support the development of an agile BI environment. Agile BI is not just the implementation of the agile approach; the BI platform is also a critical component that can determine the success or failure of an agile BI implementation. Specifically, the BI platform should:

  • Leverage the concepts of agile development
  • Support integration of heterogeneous data sources at an individual, team or enterprise level
  • Be usable by information consumers and content creators with minimal IT intervention
  • Support the deployment of BI artifacts and social collaboration across multiple audiences of different sizes and levels of domain knowledge
  • Provide visualizations and analyses that facilitate the discovery and derivation of insights
  • Support rapid prototyping as well as enterprise level development

Finally, the deployment of an agile BI environment is a journey. It is continually evolving based on changes in information and analytic requirements, market conditions, organizational DNA and unforeseen future needs.

 

 

Posted in Business Intelligence | Leave a comment

Business Critical Line-Of-Business (LOB) Data in SharePoint 2013 – Using Winshuttle for no-code or minimal-code approach

Part 1 | Part 2 | Part 3

In part 3 of my blog series, I'll discuss a minimal-code approach to integrating SAP with SharePoint. This approach means utilizing a tool or toolset to provide an end-to-end solution. I'll further discuss how we can use the Winshuttle suite of products to enable integration of SAP with SharePoint 2013.

Using Winshuttle, one can create an end-to-end solution rather than just creating services that expose SAP data. Winshuttle has products that allow you to connect to multiple SAP touch points such as SAP transactions, BAPIs, and tables, and expose the data as a web service. There are many tools on the market that allow you to connect to SAP BAPIs and tables, but Winshuttle is unique in that it allows you to record entire SAP transactions, such as MM01 to create a material in SAP. This allows a non-technical person who is well versed in executing SAP transactions, such as a Master Data Analyst, to create a powerful web service that exposes a complete business process in a matter of minutes.

A separate set of Winshuttle products can then be used not only to create an electronic form that consumes those web services but also to design the business workflow process using a designer tool. In the end, you can integrate most SAP business processes without writing a single line of code. For example, you can create a SharePoint portal that integrates SAP master data with SharePoint and allows users to participate in a workflow.
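
As a rough illustration of what consuming such a generated service can look like from the outside, the sketch below posts a material-creation request to a hypothetical web service endpoint using Python. The URL, field names and response shape are assumptions made for illustration only; a real Winshuttle-generated service defines its own contract.

    import requests

    # Hypothetical endpoint for a service recorded from the MM01 transaction.
    SERVICE_URL = "https://example.corp.local/services/CreateMaterial"

    payload = {
        "MaterialDescription": "Stainless hex bolt M8",
        "MaterialType": "FERT",
        "BaseUnitOfMeasure": "EA",
    }

    # Call the service the same way a form or portal component would.
    response = requests.post(SERVICE_URL, json=payload, timeout=30)
    response.raise_for_status()

    # Assume the service echoes back the new material number (illustrative only).
    print("Created material:", response.json().get("MaterialNumber"))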

A vast majority of solutions can be implemented by someone who is well versed in the tool and has a good understanding of SAP LOB data. This is a significant benefit: it reduces the constant dependence on IT, and the ability to build more self-service solutions makes the business more agile. The savings are significant as well, since even the simplest self-service solution can save the countless hours that would otherwise be spent manually entering data into SAP.

Anyone who has been involved in creating SAP integration solutions for the Microsoft stack knows very well that no two SAP integrations are the same and that no product on the market can solve every business problem. Ideally you want all your customers to build their requirements around the strengths of an enablement platform such as Winshuttle. In reality, that's not always the case. Very frequently you will run into situations where a product, at least out of the box, is not going to meet all your needs.

This is where Winshuttle is different. Yes, it allows you to create moderately complex no-code solutions, but for those advanced, application-like scenarios, the platform allows more advanced users such as IT programmers to extend the solution. The person implementing the solution doesn't necessarily have to be a programmer, but it certainly helps to have one implement the more advanced solutions, as they will require some custom coding. As a result, extremely powerful SAP data automation solutions can be created with Winshuttle using minimal code.

Part 1 | Part 2 | Part 3

 

Posted in SAP Integration | Leave a comment

Line of Business Driving Agile BI

It is no longer acceptable to wait months to build a data warehouse, create reports or add new insights to a dashboard; the world and business are moving too fast. Today, customers already have solutions in place that they are using to provide insights into most of their questions. In fact, 83% of data already exists in some reportable format. The rest of the information they are looking for also exists, but is typically stored in local databases or spreadsheets. The goal of an Agile BI solution is to quickly connect the existing BI solution with these new databases or spreadsheets to deliver new insights.

OLD DOG, NEW TRICK

The traditional BI approach involves defining business requirements at the entity level, architecting and building a data model, configuring an ETL process, and then building reports and dashboards over the data. An Agile BI approach starts with the data visualization and leverages existing data sources regardless of format or structure. It leverages the cloud for quick results and focuses on the user experience and collaboration. This Agile BI approach allows users to see results in days or weeks, not months or quarters.

We are not saying you should build your data warehouse on standalone spreadsheets or databases, but as new insights are being defined and tested, a more agile approach is required to quickly validate assumptions and deliver results. In the last two years we have seen an explosion in the Line of Business driving business intelligence solutions. In most instances these people have a deep understanding of the data, and they have already built interfaces or created extracts for the information they need. The goal when working with these LOB users is not to build the perfect data warehouse; the goal is to leverage what is already in place and deliver results. Most companies already have the traditional reports and dashboards in place to run the business. The new insights LOB users are looking for involve combining the existing solution with new data sources.

CONCLUSION

Cloud solutions and public and partner data sources are offering a completely new opportunity to uncover new insights. Companies today are finding value by combining existing data with third-party or public data. Census data, retail data and consumer data all offer new ways to look at the business and deliver value. New data sources and tools offer opportunities for Line of Business users to validate assumptions and uncover business value by leveraging Agile BI.
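
As a small, hypothetical example of the kind of blending described above, the Python sketch below joins an internal sales extract with a public census-style dataset by ZIP code to compute revenue per capita. The file and column names are illustrative assumptions, not a prescription.

    import pandas as pd

    # Extract the line of business already produces today (hypothetical file).
    sales = pd.read_csv("sales_extract.csv")        # columns: zip_code, revenue

    # Public/third-party dataset downloaded separately (hypothetical file).
    census = pd.read_csv("census_population.csv")   # columns: zip_code, population

    # Blend the two sources and derive a new insight: revenue per capita by ZIP code.
    blended = sales.merge(census, on="zip_code", how="inner")
    blended["revenue_per_capita"] = blended["revenue"] / blended["population"]

    print(blended.nlargest(10, "revenue_per_capita"))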

Posted in Business Intelligence | Leave a comment

Design and document your RESTful API using RAML

Creating documentation is one of the more tedious aspects of the software development process. Not that writing algorithms and object-oriented class hierarchies isn’t tedious, but it often seems (at least to me) that writing plain text explaining your code is more tedious than writing the code itself. Also, if you consider yourself an Agile practitioner (as I do), you’re always evaluating the time spent vs. value added whenever you write much about working software outside of and separate from the working software you already wrote.

I find myself constantly analyzing and re-analyzing the darn thing. Is it too long? Too short? Too wordy? Oversimplified? Too granular? Too vague? Should I have even attempted this?

I don’t claim this conundrum is unique to me. It’s one of the main reasons that many software projects are undocumented or under-documented. Because of this, a tool that helps you write clear, simple documentation, and saves you time doing it, is an invaluable find.

RAML is one of these fabulous creatures.

I was introduced to RAML by a colleague while working on an ASP.NET Web API project over the last few months. What I like best about RAML is that it solves multiple documentation problems through the creation and maintenance of a single file:

  • You can design and modify your API collaboratively without writing / re-writing any code.
  • The syntax is simple and to-the-point, reducing time needed to properly describe the routes, actions and parameters.
  • This document serves as a team “contract” of sorts throughout the development process. Server-side developers, client-side / mobile developers, and testers – everyone – can see and agree on the intended behavior and content of your API requests and responses.
  • When the project is done, you’re left with an excellent artifact to hand off and / or publish describing exactly how your API works.

Now that's what I call valuable documentation. To get started, all you really need to write a RAML doc is a text editor and access to the spec. However, the team over at MuleSoft has developed a superb browser-based editor for RAML, complete with spec references and "intellisense" of sorts. Best of all – it's free to use. They've labeled it "API Designer" on their site; here's a sample:

[Screenshot: MuleSoft API Designer]

I used it for the project I mentioned above; I found it to be far quicker than writing a text-only version. Being able to click on and view the routes on the right side of the screen was also a really handy feature during various design and testing discussions. I found the API Designer to be a solid tool, and highly recommend it.
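
For concreteness, here is a minimal, hypothetical RAML 0.8 sketch for a small books API; the resources and fields are invented purely for illustration, and the full syntax is covered by the spec:

    #%RAML 0.8
    title: Books API
    version: v1
    baseUri: https://api.example.com/{version}

    /books:
      get:
        description: List books, optionally filtered by author.
        queryParameters:
          author:
            type: string
            required: false
      post:
        description: Add a new book.
        body:
          application/json:
            example: |
              { "title": "RAML in Practice", "author": "J. Doe" }
      /{bookId}:
        get:
          description: Retrieve a single book by id.
          responses:
            200:
              body:
                application/json:
                  example: |
                    { "id": 1, "title": "RAML in Practice", "author": "J. Doe" }

Even at this size, the document reads as a contract: routes, query parameters, request bodies and response examples are all visible at a glance.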

So the next time you kick off a REST API project, *start* with the documentation – and write some RAML.

Posted in Mobile Solutions, Web Development | Leave a comment

Mobile First – Cloud First Simplified (3 of 3)

<-Previous post for this series

The conversation about business outreach (service first) and infrastructure elasticity (cloud) does not feel complete without including…

Big Data

Every generation of technology upgrade has in some way created the need for the next. User-created content and social media initially drove the need for big data techniques, but the drivers for this movement added up quickly once it was understood what big data analysis and prediction could do for a business.

A Quick Introduction

Big data commonly refers to the mining and analysis of extremely high volumes of data. The data in question is not structured, since it is collected from a variety of sources, many of which may not follow any standard storage format or schema. The data is also described by its characteristics, primarily volume, variety, velocity, variability and complexity. The techniques for analyzing this data involve algorithms that engage multiple processors and servers, distributing the data to be processed in a logical way. MapReduce is one of the most popular of these programming models, and Hadoop is one of the popular implementations of MapReduce.
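
To give a feel for the idea (a toy sketch only, not how Hadoop is used in practice), the snippet below expresses the classic word-count example in MapReduce style using plain Python: a map step emits key/value pairs, a shuffle groups them by key, and a reduce step aggregates each group. On a real cluster, the map and reduce steps run in parallel across many machines.

    from collections import defaultdict

    documents = [
        "big data needs big infrastructure",
        "cloud infrastructure scales with data",
    ]

    # Map: emit (word, 1) pairs; in Hadoop each mapper handles a slice of the data.
    mapped = [(word, 1) for doc in documents for word in doc.split()]

    # Shuffle: group values by key; the framework normally does this across nodes.
    grouped = defaultdict(list)
    for word, count in mapped:
        grouped[word].append(count)

    # Reduce: aggregate each group independently (and therefore in parallel).
    word_counts = {word: sum(counts) for word, counts in grouped.items()}
    print(word_counts)   # e.g. {'big': 2, 'data': 2, ...}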

Big data techniques are not something that only the corporations collecting social media dust need; they are something every business needs to look into, sooner rather than later. Whether every business needs to factor in social media data in some form or other is a separate topic. Even if that part is ignored, the volume of structured data is increasing by the day.

Keeping all that in mind, it is important to explain how well big data fits with the elasticity of the cloud. Imagine an operation where data needs to be segregated by some specific parameter onto different servers. These servers might run some processing depending on the type of the data, or simply store that data to improve access time. A true cloud environment is the perfect host for such an operation: you can spin up new servers with a specific configuration at run time using just a few lines of script.

Where are we heading

In 2011 Google Fiber announced Kansas City as the first city to receive 1 gigabit per second internet speed, followed by Austin, TX and Provo, UT. According to the company's announcement in February 2014, Google Fiber will be reaching another 34 cities. AT&T stepped up by announcing that its new service, GigaPower, will provide gigabit internet speeds to as many as 100 municipalities in 25 metropolitan areas. Besides Google and AT&T, many other large and small providers, such as Cox, Greenlight Network, Canby Telcom, CenturyLink and Sonic.net, are working on targeted areas to provide superfast internet speeds.

Considering this new scale of bandwidth, the way application technology works is going to change, especially the parts that involve mobile and cloud. It will be much more convenient to have memory- and processor-intensive operations running in a cloud environment, streaming status and results to the browser on your laptop or on a small handheld mobile device.

Moving the heavy lifting to the cloud while keeping control on low-resource devices is not something that is going to happen someday; it is happening now, and only the scale and outreach are going to increase exponentially. Everyone connected to this field, be it providers, consumers, decision makers, technology workers, business users or consultants, should pay attention to the changes and keep a strategy for the future.

Mobile first, cloud first, simplified.

Posted in Cloud Computing, Mobile Solutions | Leave a comment

Enterprise Social – We are the Champions!

We talked in the last blog, Countdown to Launch, about identifying Champions to manage the communities in your new social network. You can't do it alone! A successful launch requires a team of people. We know that we are looking for committed people who will take responsibility for getting the word out and driving use of the social network, but what types of people are these exactly? Among other things…they AIM. AIM? AIM for what? They may be "aim"ing to have a successful social platform, but really I mean Act, Inspire, and Maintain.

Act

In sports, a champion (with good sports conduct) takes success and defeat in stride, keeps their goal in mind and never stops working to achieve it. Your champions should be the same. They must DO to see results. There will be times where it may seem that their efforts are not making as big of an impact as they hoped, but they need to remain committed. Their actions will set the tone for the rest of your users, their actions will drive other actions, and they will be role models for this social network. Their commitment to keep pushing for success is key.


Inspire

All of us have people in our lives who have made an impact, positive or negative, but enough for us to change our minds, direction, or attitude because of them. I am not saying that your champions need to be the next Gandhi, but I think there are a few key traits that inspirational people have.  Inspirational people are….

  • Proactive – they take initiative to get the job done and don’t wait on someone else
  • Passionate – no one’s going to buy into what you’re pushing, if you’re not sold yourself
  • Tactful & Diplomatic – people aren't going to listen if your message is one-sided, negative or delivered rudely. Having tact and diplomacy will help your message be heard, even by the skeptics
  • Creative – inspirational people are often creative: creative in the way they present their ideas, go about completing their tasks, or solve their problems. Creativity is intriguing; a little creativity can go a long way toward gaining interest.

Maintain

Being a champion isn't all about the glory; it takes work. Your champions need to be willing to put in the time it takes to be successful. The work does not end once there are users on your social platform; the job is ongoing. Champions will need to maintain content, continue to push new content, encourage others to do the same, and remember to bring social into the arena when others may not be thinking about it. They will also need to help maintain the community of users on the network. What if users have a question, need help or guidance, or are looking for tips on starting their own community? The champions are their guide.

Are you a champion? Have you identified some names in your organization that might fit the champion role? Now we understand why we need champions and who we are looking for to fill the role. Next time we will talk about some tips for your champions to succeed in driving social. In the meantime, maybe play a little Queen for your team to get them motivated: "We are the champions, my friends…"

Posted in Enterprise Social | 1 Comment

Predictive Analytics for the Masses with Power BI and Office 365

Power BI is Microsoft's cloud-based service that leverages Excel to enable self-service business intelligence. The term Power BI has also been used generically to reference the components and technologies that comprise the Microsoft BI suite of tools, specifically PowerPivot, Power View, Power Query, Power Map, Question & Answer (Q&A) and now Forecasting. The Q&A and Forecasting features are currently supported only in Office 365 and SharePoint Online. The other features are fully supported in the desktop (Office Professional Plus) and Office 365 versions of Excel 2013.

The latest incarnation of Office 365 implements time series analysis to provide forecasting capabilities. It is this version and its forecasting capabilities that will be discussed in this article. The description and definition of the specific time series algorithms behind forecasting are beyond the scope of this discussion, but the implications of providing this capability are not.

The methods and techniques for time series analysis are well documented and understood in academia and in the field of statistics, but this capability is now being placed in the hands of the masses, who may or may not have a thorough understanding of the associated techniques or how to interpret the results. This may present a change management issue for an organization, but with some planning a great deal of benefit and insight can be obtained that would otherwise not be realized.

From a change management perspective, it is imperative that a consistent approach be defined and implemented to ensure consistent results when developing an analytics solution.  This should also include a training program on terminology,  techniques, methods and practices.

Let’s take a detailed look at the process that will lead to obtaining useful insights from a forecasting exercise and then how this process applies to an example implemented in Power BI.

Process Flow

  • Business Understanding – Understanding from a business perspective of the project objectives, requirements and what the specific outcomes should be.  This may also include an initial reference to an analytic methodology or approach (forecasting, classification, etc.).
  • Data Understanding – Understanding of the traits/personality (profile) of the data.  Are there data quality issues? What are the valid domains of attribute values? Are there obvious patterns?
  • Data Preparation – Does the data need to be reformatted?  How will missing values be handled?  What are the relevant attributes or subsets of data?
  • Modeling – Identify potential modeling techniques to meet the requirements of the business solution and its objectives.
  • Evaluation – Evaluate the model and determine its fitness for use.  How accurate is the model?  Does it address the business requirements?  Have new insights been exposed that change the understanding of the data?
  • Deployment – Present the model results. Make sure the appropriate visualization is used to present the results. Does the deployment require a simple report, or is a new process required to close the analytic loop?

This process is depicted in the following diagram:

[Diagram: CRISP-DM process flow]

 

The above process steps define the CRISP-DM data mining methodology, which provides an excellent foundational approach and process for the development and deployment of predictive analytic and data mining solutions. It has been around for some time, but its basic tenets remain very applicable. Let's now look at an example of how Power BI forecasting can be leveraged and how the process steps are implemented.

Example

The following data represents new and used car sales from 2002 to 2014, stored by month. Examining the raw data is the opportune moment to address business understanding and to identify the business problem and requirements; in this case, the business problem is to forecast future new car sales to help better manage inventory. Understanding the nature and characteristics of the data should also be accomplished at this point. This can be done via data profiling (min, max, null counts, standard deviation, etc.) and through data visualization, and it also helps to have a domain expert available to provide additional insights. With regard to data preparation for Power BI Forecasting, there should be an attribute that can be used for time series analysis. In this case, a new attribute named [Period Ending] is created by combining [Year] and [Month], represented internally as a date.

 

[Screenshot: source data – monthly new and used car sales]
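
Outside of Excel, the same preparation step can be sketched in a few lines of Python with pandas; the file and column names below simply mirror the description above and are assumptions for illustration:

    import pandas as pd

    # Monthly new and used car sales with separate Year and Month columns (hypothetical file).
    sales = pd.read_csv("car_sales.csv")   # columns: Year, Month, NewUnits, UsedUnits

    # Data preparation: combine [Year] and [Month] into a single [Period Ending]
    # attribute stored as a real date so it can drive time series analysis.
    sales["Period Ending"] = (
        pd.to_datetime(sales["Year"].astype(str) + "-" + sales["Month"].astype(str) + "-01")
        + pd.offsets.MonthEnd(0)
    )

    # Quick profiling to support the data understanding step.
    print(sales["NewUnits"].describe())
    print(sales.isna().sum())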

 

The above data was loaded into a PowerPivot workbook and uploaded to Power BI where some visualizations were applied.  The line chart shows new car sales units over time.  This line chart will be our candidate for time series analysis (forecasting).  Note that there appears to be a cyclical pattern in the data.  This is a good reason to generate a visualization to provide insights into the nature of the data.

[Screenshot: Power BI line chart of new car sales over time]

Currently, to perform forecasting, Power BI must be placed in HTML5 mode.  This is accomplished via an icon in the lower right corner of the web page. Once that has been done, then hovering over the chart will expose a caret that indicates forecasting may be performed.

[Screenshot: forecasting caret exposed on the chart in HTML5 mode]

Clicking the caret produces a forecast and displays an additional panel that contains adjustable sliders for the confidence interval and seasonality. The forecasting algorithm will attempt to detect seasonality and display the calculated cycle in terms of units. The seasonality slider allows the number of periods over which cycles repeat to be set manually; for example, if it is known from domain knowledge that the seasonality differs from what was calculated, it can be adjusted accordingly, which may change the forecasted values. In this case, the seasonality is detected to be 12 units (1 year).

 

[Screenshot: forecast with confidence interval and seasonality sliders]

The confidence interval slider displays a shaded area that indicates the range within which forecasted values are expected to fall, expressed as a number of standard deviations. If a tight band around the forecast is required, select one standard deviation; the band also gives an indication of how well the forecast model fits the data. The nature and requirements of the business problem and of the user will determine an acceptable value for the confidence interval. For this data, 68% of expected values fall within one standard deviation.

[Screenshot: confidence interval band on the forecast]
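
For readers who want to see the underlying idea outside of Power BI, here is a rough Python sketch that assumes the prepared monthly series from the earlier example (hypothetical file and column names). It fits a Holt-Winters model with a 12-period seasonal cycle and draws a one-standard-deviation band (roughly 68%) from the in-sample residuals. This only approximates the concept; it is not the algorithm Power BI actually uses.

    import pandas as pd
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    # Assumes a prepared monthly series like the one built earlier (hypothetical file).
    sales = pd.read_csv("car_sales_prepared.csv", parse_dates=["Period Ending"])
    series = sales.set_index("Period Ending").sort_index()["NewUnits"]

    # Fit an additive seasonal model with a 12-month cycle, mirroring the
    # 12-unit seasonality described above.
    fit = ExponentialSmoothing(
        series, trend="add", seasonal="add", seasonal_periods=12
    ).fit()

    # Forecast the next 12 months.
    forecast = fit.forecast(12)

    # Rough one-standard-deviation band derived from in-sample residuals.
    residual_std = (series - fit.fittedvalues).std()
    print(pd.DataFrame({
        "forecast": forecast,
        "lower": forecast - residual_std,
        "upper": forecast + residual_std,
    }))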

There is also the ability to perform a hindcast. A hindcast uses historical data to generate predictions from a selected point in the past, showing how the current predictions would have looked if the forecast had been generated at that earlier point in time.

[Screenshot: hindcast]

Prior to this point, the appropriate type of model would have been selected (time series), applied and evaluated. Within Power BI, the option to select a specific time series model is not available; with regard to model evaluation, adjusting the confidence interval and hindcasting provide the ability to evaluate the overall fitness of the model.

Finally, the model is deployed and can be used for re-evaluation. This can be done by exporting the model along with its data to Excel and running it back through the forecasting model again.

We have demonstrated how Power BI forecasting can be leveraged using the CRISP-DM methodology and how advanced analytics can be placed in the hands of the masses. Power BI as a solution is simple to understand, uses existing technologies and is straightforward to implement. Over time, more and more advanced analytic capabilities will be exposed to the masses, and to be successful, a well-defined process and approach and appropriate training must be used to ensure that proper results and insights are obtained.

Questions and comments can be addressed directly to:

Ray Johnson

Director, Data Management – Strategist
e-mail: ray.johnson@mpspartners.com

LinkedIn: http://linkedin.com/in/rayjay

Posted in Business Intelligence, Microsoft | Leave a comment

Enterprise Social – Countdown to Launch

So you have support and buy-in from the powers that be at your company to launch a social network, you have the platform set up, and you're ready to "go live." The platform will be ready tomorrow, so you will launch next week, but are you really ready? Do the users know about the platform? Will they use it? Who are your social champions to help spread the word? Let's take a step back and look at the preparation that goes into a successful launch.

 1. Pre-launch Communication
No one sent you an email and asked you to join LinkedIn or Facebook, but you must have heard about the platform from somewhere or you wouldn't have known to join. The same goes for your social network. Get the word out! Communicate to your users that social is coming to your workplace and that this tool will be available to them soon. Communication can be via email, but we all get so many emails in a day that many get overlooked. I would instead suggest posters, table tents or even a game. Post teasers to drive interest, like "A new way to communicate…" or "Breaking down barriers…", then enhance the message to include "with social" or "with Yammer".

This will drive some buzz and curiosity about what’s coming. Awareness is half the battle of a successful launch and long term adoption.

2. Write a Usage Policy
Usage policy is kind of a strong term for what we really want to create here. I have heard many clients tell me they don't want social because they can't govern it. I have two things to say about that: 1. there are ways to govern these platforms, and 2. you shouldn't have to. Your users have access to email, phones, laptops, and other devices that allow them to communicate with each other and your external partners and to access company information. There are already policies around what they should and should not be doing and saying, and the same applies to social. Just because there is a new communication channel does not mean that your employees are suddenly going to toss the rules out the window when they join your social network. Instead of a usage policy, put together a social etiquette document. Remind users that the same policies that govern other areas of the company still apply, but more importantly, tell them about the purpose of your social network, whether it is intended to connect with external partners or to provide a space for project team collaboration.

 3. Identify Social Champions
Social grows organically, right? Facebook was created, people heard about it, people joined, and it grew bigger than anyone had ever imagined. That's how social networks are supposed to work, and this is true. BUT if people had joined Facebook and then no one posted anything, and everyone was just an information consumer, do you think it would've succeeded? Probably not. Your social network is no different.

We need people who are going to champion this network, people who are committed to making this initiative a success. As people join, they will start to utilize the platform, just like people did with Facebook, but someone has to start the movement. Your identified social champions (or community managers) are responsible for pushing the social platform in places where email or another tool may have been the solution before: writing posts, responding to questions, and helping users understand the benefits of the platform.

 

4. Create Network Structure
Your file shares have structure, and so does your document management system; your social platform is no different. Social has some level of organization just like everything else. What networks will you create to start? Will they include only internal users, or will external users be allowed access? What groups will be created in your network? Will they be organized by subject matter, project, working team, or something else? Your entire roadmap doesn't need to be defined to launch social, but you should have an idea of what your network will look like. After you write your social etiquette document, you will have identified the purpose of your social network, which should help guide how content is organized. Know that your users are going to grow this network, and more than just the admin will likely have access to create new groups, but setting the framework for organization will help your users understand how content should be organized.

Your social launch is important; it's going to drive the success or failure of your platform. This is a critical time for initial adoption and for finding those who will champion this new tool. Next we will talk about what your social champions will need to succeed.

Melissa McElroy
User Experience & Social Collaboration Evangelist – Senior Manager
e. melissa.mcelroy@mpspartners.com
LinkedIn http://linkedin.com/in/melissamcelroy

Posted in Enterprise Social | 1 Comment

Mobile First – Cloud First Simplified (2 of 3)

Continuing the simplification of mobile first, cloud first from the previous post…

Let's highlight the two big objectives that are achieved by separating core business services from platform-specific clients, as described in the last post:

  • Platform and device outreach – HTTP is understood by all modern devices, which makes your service consumable by any device that can host a client application and speak the language of the web.
  • Heavy lifting done on the server – With the separation between a client app and the business logic running as a service on a server somewhere, all the heavy lifting is done on the server, while the user's device mostly handles user interaction. "Heavy lifting" generally refers to complex computations that consume a lot of hardware resources, such as CPU and RAM, which are limited on small mobile devices (see the sketch after this list).
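
As a minimal sketch of this split (using Python and Flask purely for illustration; the endpoint and the computation are hypothetical), the service below exposes a CPU-heavy operation over plain HTTP so that any phone, tablet or browser client only has to issue a request and render the JSON result:

    from flask import Flask, jsonify, request

    app = Flask(__name__)

    def heavy_computation(n: int) -> int:
        # Stand-in for a resource-hungry business operation; it runs on the
        # server, not on the user's device.
        total = 0
        for i in range(n):
            total += i * i
        return total

    @app.route("/api/report")
    def report():
        # Any HTTP-capable client (mobile app, browser, desktop) can call this.
        size = int(request.args.get("size", 1_000_000))
        return jsonify({"size": size, "result": heavy_computation(size)})

    if __name__ == "__main__":
        app.run()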

Now let’s talk a little about the server.

 

Is your application business ready or feature ready?

So now we have built our application in a RESTful manner to reach a broad spectrum of devices, and we have moved the heavy lifting to the server. At this point our business idea can either take off or send us back to the drawing board. In either case, the load on the server that is doing the complex operations is going to fluctuate.

The question here is: is the application infrastructure elastic enough to support that, or will such increases and decreases in infrastructure come at a heavy cost?

It is a difficult question for any developer to answer: how many users (or how much traffic) would the current server infrastructure be able to handle? The best answer you will get is a very careful calculation, based perhaps on stress testing and generously padded with seasoned wisdom. In fact, for a new application, or an application rewritten with newer frameworks, it is very difficult to evaluate the ideal infrastructure requirements until the rubber hits the road. To be on the safe side, every team tends to overestimate.

 

Cloudy with heavy awesomeness

Moving the infrastructure to the cloud will help you achieve such elasticity. You do not really need to worry about contacting data centers; you can spin up new servers, and shut them down when they are not needed, using a few lines of script. Depending on the service you are using, you can perform many infrastructure operations through a self-service portal and be charged only for the infrastructure you use, for the duration you use it.

Suppose that after we launch our application, we find that our target customers are in a specific geographic location, such as the east coast, or in some other part of the world that our analysts never imagined. Can we quickly respond to the newfound opportunity? Most cloud service providers will allow you to select the geographic location of your infrastructure, allowing you to place more servers closer to the customer for an optimized user experience.

Global cloud providers are large organizations that have invested heavily in infrastructure over the years, providing you with high security and availability. There are, therefore, many benefits that your business gets by moving to the cloud that might be difficult to estimate beforehand.

Next post of the series ->

Posted in Cloud Computing, Mobile Solutions | Leave a comment

Enterprise Social – Continuing to Engage with Business Integration

How many systems do you access in a day to get the information that is critical to accomplishing your job? I can think of a MINIMUM of four without spending more than a moment. I navigate between systems to get communication from my co-workers and clients and to review information. How much time is spent flipping between systems to get to this information? Think about the amount of time it takes to log in to each of these systems to review, create or update information and make business decisions. Wouldn't it be nice if the business-critical information was at your fingertips? It can be, with social integration.

  1. One location means fewer places to hunt for information
    We effectively only work 3 days a week. Statistically, we spend the fourth day searching for information and the fifth day inefficiently dealing with data, such as reformatting, updating, versioning, copying, pasting and getting data out of one application to put it into another.[1]

    Integration with social can help reduce both the searching for information and the moving of information between systems. Social feeds can be integrated with ERP systems to display important transactions, as well as with reporting tools and CRM systems.

    Instead of opening Salesforce to view the latest sales information, new sales leads can feed into your social news feed. Not only is the need for additional licenses reduced, but in addition to your sales team knowing what's in the pipeline, your sales support members are also in the know before they get that invite to help with an RFP they didn't know was in process. (A rough sketch of this kind of feed integration follows this list.)

  2. Sharing Knowledge – Information that may once have been unavailable is now sharable
    Consider this… you have a question about a report that was recently released via email. You could reply via email to a few key members of your team who you think can help, and wait for a response. Instead, you reply to the feed post containing the report with the same question, mentioning the team members you think can help. Not only do those users see the post, but anyone following them does as well. You have now increased the knowledge base you are tapping for a solution, and you may also have helped someone who had the same question.
  3. Driving users to social – Social is more than just posting & hashtags
    I often get the question from our clients about what the point of social is. Isn't it just posting random stuff? No, it's not. Social can be successful in your enterprise environment. Integration with your other systems brings social to light as a business tool and not just a "social" tool, and your business information can benefit from the social features. When that report shows up in the newsfeed, it can be tagged with topics such as the system it came from and the subject of the report. People who are following these tags will instantly have the report show up in their feed, and when people search on these tags, the report will be returned in the results. It's social metadata, if you will.
  4. Keeping up to date on other information because they’re going there for their job
    You integrate your first system, let's say CRM, with your social platform. You have told the sales team that they will get their sales updates via the newsfeed and not via email. The team begins the transition to checking the newsfeed regularly for their information, which they may not have done in the past because they had no reason to. Now, while perusing the feed and looking at information tagged with #Sales, they also notice comments about a recent meeting a partner had with a client that may turn into a sales lead, and about a project that is ending at another client that may turn into another phase. The sales team might not otherwise have known about these events, and there would have been missed business opportunities.
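
As a rough, purely illustrative sketch of the feed integration mentioned in the first item above (the endpoints, fields and token are hypothetical placeholders, not the actual Salesforce or Yammer APIs), a small job might poll the CRM for new leads and post a tagged summary into the social newsfeed:

    import requests

    CRM_LEADS_URL = "https://crm.example.com/api/leads?status=new"   # hypothetical
    NEWSFEED_POST_URL = "https://social.example.com/api/messages"    # hypothetical
    API_TOKEN = "replace-with-a-real-token"

    def sync_new_leads_to_feed():
        headers = {"Authorization": f"Bearer {API_TOKEN}"}

        # Pull newly created leads from the CRM system.
        leads = requests.get(CRM_LEADS_URL, headers=headers, timeout=30).json()

        # Post a short, tagged summary of each lead into the social newsfeed.
        for lead in leads:
            message = (f"New lead: {lead['company']} ({lead['region']}) "
                       f"estimated at {lead['amount']} #Sales")
            requests.post(NEWSFEED_POST_URL, headers=headers,
                          json={"body": message}, timeout=30).raise_for_status()

    if __name__ == "__main__":
        sync_new_leads_to_feed()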

So how do you get started? What systems do you integrate first? My suggestion is to think about who in your organization has adopted the social platform the most, but also who hasn't. If there is a team in your organization that you think has missed opportunities by not participating, maybe it makes sense to include some of their data. Drive them to the platform with their business-critical information, and allow them to reap the benefits of the other information that is shared there.

Melissa McElroy
User Experience & Social Collaboration Evangelist – Senior Manager
e. melissa.mcelroy@mpspartners.com
LinkedIn http://linkedin.com/in/melissamcelroy

[1] Feldman, Susan, "Hidden Cost of Information Work: A Progress Report," International Data Corporation (IDC), Framingham, MA, May 2009.

Posted in Enterprise Social | Leave a comment
 