In this blog series we will discuss usability and user experience: what they are, what they are not, and why they are so important. Usability and UX are not just for externally facing customer sites; it is important to apply these concepts to your internal sites and applications as well. Employees are users too! Once we have a good understanding of the terms, we will go on to look at these topics with respect to SharePoint mobile interfaces. Let’s start with understanding usability.
Usability is ensuring that the user can complete the desired task. Simple, right? It seems simple enough, but far too often interfaces are confusing! Information is not organized in a manner that meets the user’s expectations or mental model (how we expect something to work based on prior knowledge), information is hard to find, and the feedback provided by the interface, such as error messages and prompts, is not useful or intuitive. If users have difficulty signing up for a service, they may go elsewhere because it is just too difficult. Users have enough going on in their lives; the systems they use should simplify tasks, not make them harder.
I had an issue on a site the other day and I won’t divulge what site I was on, but I will say it’s a well-known business journal and a subscription is needed to view their digital media. I entered in all of my information, clicked next, entered my payment information, and clicked Submit. I received….
Error - This email address already exists.
Ok… so did you just charge my credit card? Can I log in? I’m confused, so I go to log in. The site tells me that my account is not activated and that I need to confirm my email address via a link they’ve sent. But you said it already exists, and now it’s not set up? I’m even more confused. Needless to say, I never signed up for the subscription and got my information elsewhere, and the company… well, they lost out on a sale. So, let’s talk about how this can be avoided.
What a user “Can Do”
Usability is all about what a user can do with an interface. Not whether they can click on the logo to go home or a link to read an article; all of that is functionality. Usability is whether they can complete the tasks they expect to be able to accomplish. Can they find what they’re looking for in the navigation bar(s) and search? Can they easily find information when looking at a product page? When entering information, is it clear which fields are required, what information goes in them, and in what format it should be entered?
If a user cannot find something, then to them, it doesn’t exist!
Put simply, users should not need training to interact with the interfaces. That is not to say that a user doesn’t need training to understand a business process, like how to do month end for accounts payable. When a process is understood, the interface that is utilized to complete or retrieve information about that process should be intuitive for simple tasks and provide cues to guide the user for more complicated tasks. Have you ever been trained on how to use Bing? Or an online banking site or app? Probably not, and I bet that you use these products often, multiple times a week even and can successfully accomplish your tasks.
When a user interacts with an interface they should be able to remember how to complete the process. Users should not be expected to have a manual in front of them to use an application. Helping users remember can be accomplished with information chunking, implementing functionality that follows the user’s mental model, and using best practices and standards that are put in place such as common icons and colors. And just in case they can’t remember, there should be on screen help or the ability to search for help information.
The handles imply pull to open, the mental model of how the door should function was not followed, so cues had to be added.
Interfaces should allow users to complete their tasks in a timely manner. Some processes just take longer, but are there ways the process can be made better? As important as optimization is, another factor related to efficiency is how many errors the user encounters in the application. I am not just talking about bugs in the code, where an error is handled. I am referring to users being able to clearly understand what is being asked of them so they make fewer errors when entering information, being able to click on the right link because links are properly spaced, and being able to select the right information because the labels make sense.
The value? Which value do I need to correct? The necessary information is not provided for the user to easily correct the error.
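To make the point concrete, here is a small illustrative sketch (not from any site mentioned in this post; all names are hypothetical) of a validator that tells the user exactly which field failed and what a correct value looks like, instead of a vague “the value is invalid” message:

```python
# Illustrative sketch: field-specific, actionable validation messages.
# The form fields and message wording are hypothetical examples.
import re

def validate_signup(form: dict) -> list[str]:
    """Return a list of error messages that name the field and the fix."""
    errors = []
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", form.get("email", "")):
        errors.append('Email: enter an address in the form "name@example.com".')
    if not re.fullmatch(r"\d{5}", form.get("zip", "")):
        errors.append("ZIP code: enter exactly 5 digits, e.g. 60601.")
    return errors

print(validate_signup({"email": "bad-address", "zip": "606"}))
```

The user never has to guess which value to correct, because each message names the field and shows the expected format.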
Design for your Audience
Always keep in mind who is going to use the application when designing. A children’s website will be designed very differently from a college information database based on the knowledge of the user as well as the intent of the site. Keep in mind what is important to your audience on the page, their level of expertise on the topic, and the mode and environment in which they will be accessing the information. Creating personas before doing design work is one way to ensure that the users are kept in mind throughout design.
In addition to having satisfied users and customers, usability can save and make you money. Money can be saved by making processes more efficient, eliminating unnecessary work, and decreasing training costs. Revenue can be increased by redesigning to reduce drop-off rates and increase conversion rates. Check out Human Factors’ ROI calculators to better understand how your bottom line can be improved with usability.
So…How do we accomplish this?
First, talk to the users. The users are the ones who hold the knowledge on how to make an application successful, and we can learn a lot from them through interviews, focus groups, surveys, and other analysis techniques. After we learn from the users, we put together user personas and use cases, generate the design, test with the users and optimize as necessary, develop, test, and optimize some more. When users test an application, items will come up that were not thought of during design. That’s ok; it’s why we test and optimize in an iterative fashion.
Now that we know a little bit about the important factors in how to ensure the user CAN use the application, in my next blog we will talk about the user experience and what the user WILL do in an application.
Questions and comments can be addressed directly to:
In Part 1 of this 3 part series on Power BI, I discussed the tenets of self-service analytics and how Power BI could be an enabler. In part 2 of this series, the focus will be on the specific capabilities of the Power BI platform. How to perform specific tasks with Power BI will not be addressed, only the capabilities and potential of the overall platform. Part 3 of the series will discuss and review deployment scenarios.
As a review, the major components of Power BI are: Power Query, Power Pivot, Power View, Power Maps (available as Excel plugins), Question & Answer (available via Office 365) and SharePoint as the delivery mechanism. Each of the components is designed to complement the other components of the Power BI platform. From a data management perspective, the components can be categorized as follows:
From a process perspective, the components of the Power BI platform build upon one another in the following manner:
The above process flow serves to illustrate how the components of the Power BI platform may be used to build and deploy a solution. Now the question is: what are the capabilities of the components, and how can they be leveraged?
Power Query – Data Discovery & Transformation
The first questions that are asked when building an analytics solution are what is the goal/purpose of the solution, what data is required and where does the data exist? Power Query provides the capabilities to address these questions.
With Power Query, multiple internal or external data sources can be imported and subsequently transformed to fulfill the defined solution requirements. Query results are stored as tabs within an Excel Power Pivot workbook. The data sources may come from standard sources such as databases, files and web services. Additionally, data may come from external sources such as web pages, and registered data providers such as the Azure Data Marketplace or even Wikipedia.
This provides the capability for data enrichment that has not been readily available in the past. An example would be supplementing internal customer address data with web based public data on average household incomes. Furthermore, transformation rules can be generated that filter or augment the data prior to final analysis. The transformation rules can be saved for future use when the data is refreshed. The following figure illustrates the process.
For registered data providers, Power Query can facilitate data discovery using simple queries such as “Average home selling price in region ABC”. Using this capability, Power Query presents samples of candidate data, from which the most appropriate can be selected to fit the requirements. Power Query will then retrieve the associated data, making it available for transformation and analysis.
The following figure shows an example of an online query performed for the “mean income for California counties”. Note that Power Query presents multiple options. A preview of each data source is displayed as it is selected. Once an appropriate source has been chosen, it can be imported into Excel.
Figure 1: Power Query online search
Here is the data that was retrieved:
Figure 2: Power Query results selection
Next, the data is filtered to Los Angeles:
As previously stated, data retrieved via Power Query may be stored as a tab in the associated Excel Power Pivot workbook where further data integration and consolidation may be performed.
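The enrichment idea described earlier, supplementing internal customer addresses with public average-household-income data, can be sketched in plain Python to show the join mechanics (Power Query expresses this in its own M language; the customer names, ZIP codes, and income figures below are made up for illustration):

```python
# Hedged sketch of data enrichment: join internal customer records with
# external public data by a shared key (ZIP code). All values are hypothetical.
customers = [
    {"customer": "Acme", "zip": "60601"},
    {"customer": "Globex", "zip": "98101"},
]
avg_income_by_zip = {"60601": 85000, "98101": 97000}  # stand-in for public data

enriched = [
    {**c, "avg_household_income": avg_income_by_zip.get(c["zip"])}
    for c in customers
]
print(enriched)
```

The lookup-and-merge step is exactly what a Power Query transformation rule captures, and because the rule is saved, the same enrichment re-runs automatically when the data is refreshed.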
Power Pivot – Data Integration and Consolidation
Once data has been identified, it needs to be integrated and consolidated for further analysis. Power Pivot can be used for this task. Power Pivot does not provide some of the data discovery capabilities of Power Query, but it can leverage Power Query data as well as integrate data from other sources. Power Pivot overcomes the data volume limitations of Excel by allowing for the analysis of millions of rows of data from multiple data sources. This is accomplished via a column-oriented in-memory analytics engine and data compression.
Each Power Pivot source is materialized as a tab in an Excel Power Pivot workbook. Relationships can then be established among the data sources using common fields. This will require some level of planning and design to ensure that relationships do exist among the data. In addition to defining relationships, Power Pivot supports the Data Analysis Expression (DAX) Language which is used to define calculations and perform data manipulation and analysis. It is possible to define calculated columns and fields that are context aware, key performance indicators and hierarchies. The following figures illustrate how data sources and data relationships are materialized in Power Pivot.
Figure 3: Power Pivot Data Sources
Figure 4: Power Pivot Data Source Relationships
It is also possible to create perspectives (views) of Power Pivot data that provide a limited subset of the fields available in the Power Pivot model. This is useful for designing functional views of the data (Sales, Operations, Finance, etc.).
Power Pivot has an additional component that can be integrated into SharePoint. In a SharePoint environment, Power Pivot query processing and data refresh for published workbooks is enabled through Power Pivot server components that are available in SQL Server 2008 R2 and later. The Power Pivot for SharePoint feature provides services, a management dashboard, library templates, application pages, and reports for using and managing server software. Power Pivot server components are fully integrated in SharePoint. SharePoint administrators use Central Administration to configure and manage Power Pivot services in the farm. Power Pivot workbooks stored in SharePoint are stored in a Power Pivot gallery that provides previews of the contents of Power Pivot based Excel reports.
Power Pivot workbooks can serve as data sources for Excel pivot tables, SQL Server Reporting Services (SSRS), Power View/Power Map visualizations and other Power Pivot workbooks. Power Pivot workbooks also form the foundation for Power BI Question & Answer functionality.
Power View – Data Visualization
Power View facilitates data visualization. Visualizations are developed in Excel and leverage an underlying Power Pivot Model, Tabular Mode SQL Server Analysis Services (SSAS) instance or more recently, a Multidimensional Mode SSAS instance.
Power View allows for the creation of highly complex and interactive visualizations. The associated model (Power Pivot, SSAS Tabular or SSAS Multidimensional) provides the context that drives the visualization.
Several types of visualizations are available in Power View:
Tables – Flat Table, Matrix (Pivot Table), Card (Data visualized as a note card)
Bar Chart – Stacked Bar, 100% Stacked Bar, Clustered Bar
Column Chart – Stacked Column, 100% Stacked Column, Clustered Column
Maps – Bing Maps
Other – Line, Scatter, Pie
Slicers – Objects which filter data
Tiles – Provide filtering capability and facilitate nesting of objects
Within Excel, each Power View visualization resides on a separate tab and may contain one or more of any of the aforementioned visualization objects. This provides flexibility in creating visualizations that are limited only by one’s imagination. Objects may also be nested. The following figure illustrates some of these capabilities:
Figure 5: Power View Visualization Objects
The above figure depicts a tile object that is filtering on “Bikes”. The tile object contains a table, a column chart, and a bar chart. On the right-hand side of the image is a context-sensitive menu that contains formatting options for the selected visualization object.
Power View visualizations can be developed in Excel and uploaded to SharePoint for distribution. Furthermore, Power View visualizations can be created directly in SharePoint. When created directly in SharePoint, Power View visualizations may be exported to PowerPoint with their full interactivity.
Power Maps – Data Visualization
Power Maps extend the data visualization capabilities of Power View with a focus on geospatial data. Data can be visualized as column, bubble, heat map, or regional (used with country, region, or postal codes) charts with multiple map layers, annotated with text, or overlaid with two-dimensional charts. Data may be mapped using standard country, region/province/state, and postal codes, as well as longitude and latitude. Data to be mapped can be provided by Power Pivot or by individual worksheets within an Excel workbook.
A Power Map visualization within an Excel workbook is called a tour and each tour contains individual sheets referred to as scenes. This provides the ability to map multiple areas or scenarios within a single Power Map visualization. Additionally the tour and its associated scenes may be played back and captured as video or screen shots for distribution.
One more dimension of a Power Map visualization is time. A time attribute may be associated with data in a Power Map visualization so that changes in the data values may be displayed over time. Time can be specified as a day, month, quarter, or year. The associated values may be displayed as point-in-time, cumulative, or static until replaced. Using this feature with the ability to record and play back Power Map tours provides significant visibility and insight into what the data visualizations represent. The following figure shows a Power Map tour and scenes.
Figure 6: Power Maps Visualization
The left side of the visualization displays the Power Map tour and its associated scenes. The tour name is “Product Line” and the scene names are “Product Line Sales” and “Product Line Margin” respectively.
The mapping mechanism for Power Maps is provided by Bing Maps. This means it is possible to zoom in/out, tilt/rotate, and display the visualization in a globe or projected (flat) view.
Self Service BI – Question & Answer
There are times when data is available but not appropriately formatted for presentation, or there may be ad hoc questions that need to be asked. This is where Question & Answer (Q&A) becomes relevant. Normally one would just consume a Power Pivot workbook or Power View/Power Map visualization as-is; Q&A moves beyond that. Q&A can be used to query existing Power Pivot workbooks as well as data that is on premises.
Q&A is a component of Power BI that is hosted via Office 365. Data sources can be queried using natural language queries within the context of the selected data. An example of such a query would be “Total sales for product line 123 and in the United States”. Q&A parses the query as it is entered and presents its interpretation of what it considers is being asked. To best take advantage of Q&A, a vocabulary of synonyms should be established to tie business concepts to the underlying data.
If the same question is asked regularly, then that question may be saved as a featured question in the Q&A catalogue. Also, Power Pivot workbooks can be saved as favorites and presented as featured workbooks for analysis (see the following figure).
Figure 7: Q&A Featured Questions & Reports
Q&A will determine the best way to display query results based on what is being asked. Results may be presented as a single value, table, chart (bar, line), or map. If required, an option exists to override the query results display format (see the following figure).
In the figure below, underneath the entered query is the Power BI interpretation of that query. On the right-hand side of the visualization are the options to change the visualization display format.
What has been presented thus far is a high level overview of Power BI capabilities with the intent of providing ideas as to how it can be deployed as a component of a self-service analytics strategy. With planning and understanding of these capabilities, it is possible to develop and deploy highly interactive, functional and complex analytics solutions that can meet the needs of multiple business roles and scenarios.
If you are working with SharePoint 2013, you might wonder how nice it would be if you could use the OOB “Get started with your site” tiles for your web parts, application pages, apps, lists, etc. If you are wondering how to do this, you are in the right spot.
OOB Get Started With Your Site Tile:
Let’s get started then
- Go to Settings >> Site Contents
- Click on add an app, then search for “Promoted Links” app.
- Once found click on it to add/create your own promoted links list.
- Give it a name.
- Navigate into the list by clicking it (obvious!)
- You will see “The list is empty. Add tiles from the All Promoted Links view.”
- Click on “All Promoted Links” hyperlink.
- Start adding your links here.
- Here is my sample link
- Order: the sequence in which you want the tiles displayed.
- The other fields are self-explanatory.
- Note: when adding/editing list items, I found the “All Promoted Links” view makes life easy.
- Once you are done adding the links, go back to the page where you want to display them.
- Click edit page
- Navigate to Insert section
- Click App part/webpart then select your list, save page and browse!
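As an aside, the same Promoted Links list could be created programmatically instead of through the UI. The sketch below builds the request body for the SharePoint REST list-creation endpoint; the list title is a placeholder, and authentication and the actual POST to `{site}/_api/web/lists` are left out. Template id 170 is SharePoint’s list template type for Promoted Links.

```python
# Sketch: build the REST payload for creating a Promoted Links list.
# Only the payload is constructed here; auth and the HTTP POST are omitted.
import json

def promoted_links_payload(title: str) -> dict:
    """Body for POST {site}/_api/web/lists (SharePoint 2013 REST API)."""
    return {
        "__metadata": {"type": "SP.List"},
        "Title": title,
        "BaseTemplate": 170,  # ListTemplateType for Promoted Links
    }

body = json.dumps(promoted_links_payload("My Tiles"))
print(body)
```

This is handy when you need the same tile list provisioned across many sites.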
The Final Results:
Continuing from my previous blog on WABS Patterns, we will implement another enterprise integration pattern in this series.
We will continue utilizing the WABS features set that are available in the current version. As and when new bits come out, our blog series will be updated to reflect the changes.
The patterns that we will address in this post will be Dynamic Router.
For this pattern, each message recipient defines rules for the types of messages it can handle and process. A message is routed to the right recipient when the message parameters match those rules.
In the Windows Azure infrastructure, Topics and Queues provide this functionality. We will use Azure Topics, create subscriptions, and extend that by including custom code in Bridges.
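The routing idea can be sketched in a few lines of Python (illustrative only, not Azure SDK code): each recipient registers a rule over message properties, and the router delivers the message to every recipient whose rule matches, just as a topic subscription’s filter matches Brokered Message properties.

```python
# Dynamic Router sketch: recipients own their rules; the router only matches.
def route(message: dict, subscriptions: dict) -> list[str]:
    """Return the names of subscriptions whose rule matches the message."""
    return [name for name, rule in subscriptions.items() if rule(message)]

# Hypothetical subscriptions mirroring the regional rules used later in this post.
subscriptions = {
    "Illinois": lambda m: m.get("region") == "IL",
    "Washington": lambda m: m.get("region") == "WA",
}
print(route({"region": "IL", "order": "1001"}, subscriptions))  # → ['Illinois']
```

Note that the router itself knows nothing about regions; new recipients can be added at runtime simply by registering another rule, which is exactly the property that makes Azure Topic subscriptions a good fit.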
Contoso accepts orders for the multiple brands of products they sell from many different partners across the country and around the globe. Contoso partners can send order data either via FTP or via web services. Along the way, each message must be dynamically routed based on its metadata. Order fulfillment is often based on region for logistics and distribution purposes. This is done by looking up the data value from the message at runtime and promoting the properties to the Brokered Message using custom code in the bridge to route messages. Since the properties are promoted dynamically at runtime from within the bridge, there is no need to manually set the properties and their values at design time. Once the destination is dynamically determined, all orders are placed in separate holding areas for further processing by the back-end line-of-business system. Each regional fulfillment center then subscribes to these order messages and processes them appropriately.
BizTalk Server Approach
To implement this pattern as a BizTalk Server solution, we would:
- Create a schema to receive the message.
- Have an orchestration subscribe to the message.
- Have the orchestration promote the message properties, in our case Region.
- Set message properties specific to routing (like address and transport type) on a dynamic send port in the orchestration.
- Let the dynamic send port route the message based on the transport protocol and address set.
WABS Integration Services Approach
The figure below represents a BizTalk Service Project.
Just like BizTalk promoted properties, we do similar processing with Brokered Message objects, which contain the properties collection used to route messages. As can be seen, we have Bridges that allow input into our process:
OrderMetaData is an XML Bridge that is surfaced through an FTP Source. The FTP source pulls the order data, along with some metadata, from an FTP site.
The figure below shows the OrderMetaData configuration:
The message received on the bridge is an XML message with the following schema:
We are extending the dynamic features available on Azure Topics by implementing the Brokered Message properties collection (the equivalent of promoted properties in BizTalk). Our goal is to dynamically assign values to properties at runtime. To implement that, we have created a metadata record on the order that identifies the property key(s) that need to be promoted at runtime for the Azure infrastructure recipient.
Bridge custom code
The OrderMetadata bridge is extended to include custom code to promote the properties at run time. Custom code on the bridge can run at the Decode, Validate, Enrich, Transform, and Encode stages.
Custom code can be run either before the message enters a stage (OnEnter) or after the message exits the stage (OnExit).
The following screenshot displays the custom code inspector (On Enter) properties for the Enrich stage within the bridge. There are step-by-step instructions on creating custom code in a bridge on MSDN.
A few key points to keep in mind while creating custom code for Bridge:
- The fully qualified assembly name (with strong name) needs to be entered as the Type on the Code Inspector.
- Copy Local needs to be set to true.
- IMessageInspector (of the Microsoft.BizTalk.Services namespace, located in the Visual Studio Extensions directory – Common7\IDE\Extensions\Microsoft\Microsoft Azure BizTalk Services SDK\) must be implemented.
- Processing logic must be implemented inside the Task.Factory.StartNew method.
The following is the custom code implementing IMessageInspector. The code reads the input message received on the bridge and, based on the metadata key, finds the value of the actual data element.
The code above is generic enough to promote properties based on the metadata key(s). Alternatively, the schema could have been modified to promote all the properties of a message based on a specific record (say, MessageHeader).
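Since the inspector itself is shown as a screenshot, here is a hedged Python sketch of the same idea (the real implementation is C# inside an IMessageInspector, and the element names `Metadata`/`PropertyKey` below are hypothetical stand-ins for the actual schema): read the metadata key from the order, look up that element’s value, and emit it as a routing property.

```python
# Sketch of metadata-driven property promotion. The XML shape is illustrative,
# not the actual WABS schema; the real code runs in C# on the bridge.
import xml.etree.ElementTree as ET

ORDER = """<Order>
  <Metadata><PropertyKey>Region</PropertyKey></Metadata>
  <Region>IL</Region>
  <Amount>250</Amount>
</Order>"""

def promote_properties(order_xml: str) -> dict:
    """Return the properties to attach to the outgoing Brokered Message."""
    root = ET.fromstring(order_xml)
    key = root.findtext("Metadata/PropertyKey")  # which element to promote, e.g. "Region"
    value = root.findtext(key)                   # that element's value, e.g. "IL"
    return {key.lower(): value}

print(promote_properties(ORDER))  # → {'region': 'IL'}
```

Because the key comes from the message itself, nothing is hard-coded: an order that named a different element in its metadata record would promote that element instead.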
Routing and Filtering
Once the enriched message exits the bridge, it is sent to the Azure topic regionroute. Topics give us the ability to add subscriptions on the fly, and hence to create message recipients dynamically.
The topic is created with two subscriptions:
- Subscription Illinois with the default rule "region='IL'"
- Subscription Washington with the default rule "region='WA'"
Once the artifacts are deployed, they can be viewed and verified in the BizTalk Services portal as shown below:
I like to use the Service Bus Explorer tool to look into the Azure infrastructure. Exploring the message in the subscription, it can be seen that the message’s custom properties contain the promoted value for the metadata key, which was used to route the message to the right recipient.
Below is the detailed tracking information on all the activity performed when the bridge was executed.
Clicking on the detail of one of the tracking activities shows the detailed message information. In the case below, it shows the custom message inspector that was executed on the ordermetadata bridge.
The message was finally routed to regionroute topic:
Install the Windows Azure BizTalk Services SDK. The developer SDK features support for the BizTalk Service project template in Visual Studio.
Provision BizTalk Services.
To deploy and test the BizTalk Services project:
- Drop orderinstanceIL.XML (region = IL) or orderinstanceWA (region = WA) into the FTP source location (free FTP hosting is available at DriveHq.com)
- The Ordermetadata bridge will pick up the data from the FTP source
- The XML message received will be used to promote the properties into the Brokered Message collection based on the metadata key
- The message will be sent to the regionroute topic
- Based on the rules defined, the data will be routed to the appropriate topic subscription.
This example showed how WABS and Azure technology provide both new (Topics with rules) and existing (promoted properties) features in the Windows Azure cloud. The queue and topic infrastructure gives us a much more dynamic way to route and subscribe to messages in our integration solutions, combined with the WABS feature for implementing custom code in the bridge.