Continuing the simplification of “mobile first, cloud first” from the previous post…
Let’s highlight the two big objectives achieved by separating core business services from platform-specific clients, as discussed in the last post:
- Platform and device outreach – Because HTTP is understood by all modern devices, your service is consumable by any device that can host a client application and speak the language of the web.
- Heavy lifting done on the server – With the separation between a client app and the business logic running as a service on a server somewhere, all the heavy lifting is done on the server, while the user’s device handles mostly the user interaction. “Heavy lifting” generally refers to complex computations that consume a lot of hardware resources such as CPU and RAM, which are limited on small mobile devices.
Now let’s talk a little about the server.
Is your application business ready or feature ready?
So now we have built our application in a RESTful manner to reach a broad spectrum of devices, and we have moved the heavy lifting to the server. At this point our business idea can either take off or send us back to the drawing board. In either case, the load on the server doing the complex operations is going to fluctuate.
The question here is – is the application infrastructure elastic enough to support that, or will scaling the infrastructure up and down come at a heavy cost?
It is a difficult question for any developer to answer – how many users (or how much traffic) can the current server infrastructure hold? The best answer you will get is a very careful calculation, based perhaps on stress testing, overly padded with seasoned wisdom. In fact, for a new application, or one rewritten with newer frameworks, it is very difficult to evaluate the ideal infrastructure requirement until the rubber hits the road. To be on the safe side, every team tends to overestimate.
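To make that overestimation concrete, the usual back-of-the-envelope calculation looks something like the following sketch. The numbers and the function name are purely illustrative, not from any real load test:

```python
import math

def servers_needed(peak_requests_per_sec: float,
                   requests_per_sec_per_server: float,
                   safety_factor: float = 2.0) -> int:
    """Estimate server count for a forecast peak load, padded by a
    safety factor (the 'seasoned wisdom' part of the calculation)."""
    raw = peak_requests_per_sec / requests_per_sec_per_server
    return math.ceil(raw * safety_factor)

# Hypothetical numbers: stress testing showed one server handles
# ~150 req/s, and we forecast a 450 req/s peak. With 2x padding:
print(servers_needed(450, 150))  # -> 6
```

The safety factor is exactly the overestimation the paragraph above describes: without elastic infrastructure, those three extra servers sit idle most of the year.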
Cloudy with heavy awesomeness
Moving the infrastructure to the cloud helps you achieve such elasticity. You do not really need to worry about contacting data centers; you can spin up new servers and shut them down when not needed using a few lines of script. Depending on the service you are using, you can perform many infrastructure operations through a self-service portal and be charged only for the infrastructure you use, for the duration you used it.
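The scripting described above often boils down to a simple feedback rule. Here is a hypothetical sketch; the thresholds and the function name are invented for illustration and are not taken from any particular cloud provider’s API:

```python
def desired_instance_count(current: int, avg_cpu_percent: float,
                           minimum: int = 1, maximum: int = 20) -> int:
    """A toy autoscaling rule: add a server when CPU runs hot,
    remove one (and stop paying for it) when the fleet is idle."""
    if avg_cpu_percent > 75:           # overloaded: scale out
        return min(current + 1, maximum)
    if avg_cpu_percent < 25:           # underused: scale in, save cost
        return max(current - 1, minimum)
    return current                     # within the comfortable band

print(desired_instance_count(4, 90))   # -> 5
print(desired_instance_count(4, 10))   # -> 3
```

A real deployment would hand a rule like this to the provider’s autoscaling service rather than run it by hand, but the pay-for-what-you-use economics are the same.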
Suppose that after launching our application, we found our target customers are in a specific geographic location, like the east coast or some other part of the world our analysts never imagined. Can you quickly respond to the newfound opportunity? Most cloud service providers allow you to select the geographic location of your infrastructure, letting you place servers closer to the customer for an optimized user experience.
Global cloud providers are large organizations that have invested heavily in infrastructure over the years, providing you high security and availability. Therefore, your business gains many benefits by moving to the cloud that might be difficult to estimate beforehand.
The device and service situation
Over the last decade, the tech industry has seen exponential growth in the variety of devices (laptops, mobile phones, tablets, gaming consoles sensitive to touch/voice/motion/gestures). This is only going to get more diversified, whether we talk about ten new devices between 5.75 and 6.85 inches, consoles and wristbands that will replace medical equipment, or watches that will talk to phones, glasses and TVs. Yes, the forecast is intense.
Relax, take some REST.
Given we can only partially foresee which future devices will be available, how do we maintain consistent delivery of business functionality to the devices that are still to be developed? Well, build a service that can communicate with all the existing devices and can also take care of the REST.
For decades we were happy with XML and SOAP messages communicating across applications. But now this communication has grown beyond traditional applications and devices. From remote servers hosted somewhere in the cloud to smart touchscreens, from set-top boxes to gaming consoles, some of these devices understand SOAP, but many do not. However, all of them positively understand HTTP. HTTP is what connects everything to the “cloud”, and staying away from either might not be a good idea.
The term “cloud” is heavily misused for sales purposes, and I will come back to it a little later. Let’s first talk about…
What is Service First?
So once you have identified that your business functionality will be delivered via HTTP, you should build your logic and expose it for consumption via an HTTP service. An HTTP service is more commonly known as a RESTful service, a REST service, or a Web API. REST is an architectural style that embraces HTTP as the transport protocol. So RESTful services are basically services built to be consumed over the web via HTTP.
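As a minimal sketch of the idea, here is a RESTful endpoint built with nothing but the Python standard library. The quote-lookup business function and the route are hypothetical, invented only to show the shape of such a service:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical business logic, kept separate from any UI.
def get_quote(policy_id: str) -> dict:
    return {"policyId": policy_id, "premium": 120.0, "currency": "USD"}

class QuoteHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /quotes/ABC123 returns the quote as JSON
        if self.path.startswith("/quotes/"):
            body = json.dumps(get_quote(self.path.split("/")[-1])).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

if __name__ == "__main__":
    server = HTTPServer(("127.0.0.1", 8000), QuoteHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    # Any HTTP-capable device can now consume the service:
    with urlopen("http://127.0.0.1:8000/quotes/ABC123") as resp:
        print(json.loads(resp.read())["premium"])  # -> 120.0
    server.shutdown()
```

The client at the bottom could just as well be a phone app, a set-top box, or a browser; all it needs is HTTP and JSON.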
Once we have functionality ready and available to be consumed across the spectrum, we can go ahead and create client apps for as many platforms and devices as we want. As far as maintenance is concerned, any change in the business logic will be one change to your service, and a consistent UX will represent the brand.
Let me start this post with a small incident that happened in our organization a couple of years back. We had switched our insurance provider, and the representative of the new provider was giving a presentation on the benefit plan. Then came a slide that mentioned their “mobile app”, among other things. While they talked about how one could look up health care providers and other information via the app, someone mentioned that they couldn’t find the app in the store. After a few seconds of confusion, the presenter promised to get back with information about the app.
The disconnect was that the app they were talking about was a mobile web site, whereas people were instinctively searching the app store. This brings out many interesting points, one of the most important being the discoverability of your application. When you look for an app that provides some functionality, do you open the browser and search online? Maybe, depending on what you are searching for, but it is highly likely that you will look for it in the app store of your device.
Which leads us straight to the ultimate option of targeting multiple devices and platforms:
Publishing the application in the app store of the platform.
There are a few ways to go about it –
One would be to choose a single device/platform and write the app using that platform’s API and language. But given that we are talking about targeting multiple devices and platforms, I would not recommend it.
The second is to abstract the business logic / functionality into a RESTful service and write multiple UI clients (AKA apps) using the native platforms. That is a decent option; however, there is a cost associated with hiring multiple developers, each with their own platform skills (Objective-C for iOS, Java for Android, C# for Windows 8/Phone, etc.). The cost of development and maintenance adds up every time a new version of a platform is released and newer devices stop supporting older platform versions.
The ultimate goal is to implement this option using existing skills and without re-writing the entire application. This can be achieved in two ways, but to achieve either, the application has to be well structured. Any well-written application defines the boundaries of at least the three traditional layers – UI, business layer and data access layer. To be able to reuse most of the code across apps, you must separate out at least the UI layer from the rest of the application logic.
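The layer separation can be sketched in a few lines. The loan-payment function below is a hypothetical stand-in for the business layer, with two thin clients that do nothing but format its result:

```python
import json

def monthly_payment(principal: float, annual_rate: float, months: int) -> float:
    """Business layer: pure logic, no UI assumptions."""
    r = annual_rate / 12
    return round(principal * r / (1 - (1 + r) ** -months), 2)

def render_console(principal, rate, months):
    """A console 'UI' client: presentation only."""
    return f"Your payment is ${monthly_payment(principal, rate, months)}/month"

def render_json(principal, rate, months):
    """A service/API client returning data for any device to format."""
    return json.dumps({"payment": monthly_payment(principal, rate, months)})

print(render_console(10000, 0.06, 12))  # -> Your payment is $860.66/month
print(render_json(10000, 0.06, 12))
```

Because neither renderer contains business rules, a new platform only ever costs you a new thin client; the logic in the middle is written once.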
Let’s take a look at these options now:
1 – Speak the Universal language – HTML5
To get the full benefit, the reusable business and data access logic should be abstracted into a RESTful service, which is then consumed by an HTML5/JavaScript application packaged for each platform with the PhoneGap framework.
2 – Hire an interpreter – Xamarin
Xamarin allows you to write native iOS, Android and Windows Store apps using a single application platform (Microsoft .NET) and language (C#). Applications built with Xamarin enjoy full native access to platform APIs and get native performance and user interfaces. Xamarin provides the ultimate ease of development and maintenance by integrating its tools right into Visual Studio and extending Microsoft tooling. Microsoft released the Portable Class Library (PCL) to develop functionality in C# that can be used across a variety of platforms. This was initially created to support Microsoft platforms like .NET, Windows 8 Store, Windows Phone, etc. In 2013 the PCL feature was extended to iOS, Android and Mac via Xamarin. Since a PCL is independent of the platforms, it does not support some platform-specific features, such as certain encryption APIs. Also, Entity Framework 6.x is not supported in PCLs, but as per the forums, EF 7 will be.
That said, you can create the shared business logic using PCLs, which each separate client application can then use. The new release of Xamarin includes a new feature, Xamarin.Forms, which allows the sharing of UI logic across platforms too.
This is the final part of a three-part series on Power BI. Part 1 discussed the tenets of self-service analytics and how Power BI can be leveraged. Part 2 provided a more in-depth discussion of Power BI capabilities. Part 3 looks at Power BI deployment scenarios.
Before deploying Power BI, you must clearly understand who your content creators and content consumers are. The following personas typically exist within an organization:
The Executive needs information that is highly aggregated to give a high-level picture of the state of a business or functional area. Often presented in the form of a dashboard or scorecard, information for the executive is intended to spark questions and drive strategic decision making.
The Analyst needs raw or lightly summarized data to consume for the purpose of creating a detailed analysis of a specific business problem or opportunity. Analysts present data in the form of spreadsheets, presentations or ad hoc reports that have limited use beyond the specific problem they are tasked to solve.
The Manager needs information that provides a detailed analysis of a specific business area or function. Data for the manager is used in planning future activities or assessing past performance. It is at a level of detail that can help define specific actions, such as a territorial sales plan or marketing activities for a product launch. Data for the manager is generally provided as reports or detailed scorecards.
Operational users deal with data presented at a transactional level. Examples of an operational report may include invoice registers created by accounting or a daily production plan published by a production planner. Operational data has been traditionally presented in the form of reports. Increasingly, data for the operational user is being presented on-line via Intranets or mobile devices.
Consumption modes include desktop applications, web browsers, and smartphone/tablet applications.
The mix of personas, audiences and consumption modes, along with core business requirements, provides input into deciding how best to deploy Power BI as a whole or as individual components.
Deployment Scenario 1 : Desktop Excel 2013
This is the simplest scenario and can service the needs of individuals accessing local and remote data sources. In this scenario an Analyst is typically trying to address a specific domain problem related to an assigned task. The output of the solution may provide insights into a related issue, or it may be a one-time solution. At this level Power Pivot, Power View, Power Map and/or Power Query may be leveraged for the specific analysis. One important characteristic of this scenario is that it may never venture far from the Analyst’s desk; also, the content creator and consumer may be the same individual. The consumption mode is generally at the desktop level using Excel.
Deployment Scenario 2 : Desktop Excel 2013 Power Pivot workbooks deployed via SharePoint
This scenario expands on deployment scenario 1. It also facilitates using multiple disparate data sources via Power Pivot and Power Query. A specific domain problem has been identified, but the value of the analysis reaches far beyond an individual analyst’s desk: it may address a department need or the broader need of a functional area. SharePoint is leveraged for the dissemination of information to a wider audience. Data can be refreshed on a regular basis using Power Pivot refresh. Since the data is stored in the Excel/Power Pivot workbook, it is possible to drill down into the transactions that support the analysis. This deployment scenario can support the operational, managerial, analytic and executive personas for domain-specific problems. Any limitations are related to managing the volume of data stored in an individual workbook as well as preventing duplication of effort within an organization. The consumption mode is via desktop applications or web browsers, including smartphone- and tablet-based web browsers.
Deployment Scenario 3 : Power View, Excel workbooks deployed via SharePoint Leveraging Tabular/Multidimensional mode Analysis Services
In this scenario, the analysis can be much more detailed, and encompass much larger volumes of data. The physical data is typically external but may also be embedded in an Excel/Power Pivot workbook. Disparate data sources have been integrated via a Tabular model or an Analysis Services cube. Complex solutions can be deployed via Power View in SharePoint or Excel Services. This deployment scenario supports all of the personas and provides a great deal of flexibility in how the solution can be processed and delivered as well as a rich interactive user experience. The consumption mode is via desktop applications or Web Browsers. This includes smartphone and tablet based web browsers.
Deployment Scenario 4 : Power BI via Office 365
Power BI via Office 365 provides the ability to service all of the above personas as well as all of the consumption modes. As in deployment scenario 3, the analysis can be much more detailed and encompass much larger volumes of data. This includes full access to all of the Power BI components (Power Map, Power Query, Power View, and Power Pivot). Additional information consumption methods are available above and beyond the base interactive functionality. For instance, there may be a situation where a specific analysis does not exist. Power BI via Office 365 provides a question-and-answer mode in which natural language queries may be executed against the associated Excel/Power Pivot workbook data source (e.g. “Show total sales by product category”). A dictionary of business terms (synonyms) should be created to fully leverage this capability; the synonyms map information consumers’ concepts to the underlying analytic model. Information may also be consumed via a Power BI app available in the Windows App Store. The Power BI app is linked to an Office 365 site supporting the deployed Excel/Power Pivot workbooks.
One item to note is that the Office 365 BI capabilities (question and answer) are cloud based, as a component of the Office 365 subscription. In deployment scenarios 2–3, on-premises or off-premises SharePoint can be used for information delivery. This raises the additional question of whether the deployment mechanism for BI content should be on or off premises. That is itself a complex topic and cannot be fully addressed without understanding an organization’s specific requirements.
The above deployment scenarios by no means cover all of the possible options available for deploying Power BI, but they do represent the more common ones. If you are considering using and deploying Power BI, the above scenarios can be used as a starting point and modified as required to fit an organization’s specific requirements.
It was a pleasure to present TDD as an evolving methodology and talk about its current state.
I want to thank the group for active participation while discussing application design, benefits & challenges around the methodology.
Also, thanks to Keith Franklin for the opportunity and for organizing the event.
The slides and demo code from the meet-up can be downloaded from GitHub here – https://github.com/MayankSri/CNUG-TDD
The YouTube videos of the “Is TDD Dead?” discussion can be found on Martin Fowler’s site here – http://martinfowler.com/articles/is-tdd-dead
Leonardo da Vinci said “Simplicity is the ultimate sophistication.”
In 2012, at a time when business and technology were growing increasingly complex, MPS Partners evolved its thinking toward a more simplified approach to solving customers’ problems. We make the implementation of SAP solutions simpler using the Microsoft platform. We think about client engagements differently and challenge the status quo to deliver simplified yet sophisticated solutions that provide greater value.
It was interesting, and a testament to our thinking and direction, to hear SAP’s new mantra is also around simplicity. I hope this materializes for them because simplifying all solutions on the SAP platform is an arduous undertaking as compared to MPS Partners simplifying a specific business solution.
Fiori is the new user experience paradigm that came out of SAP Labs a couple of years ago – the next-generation user experience built on top of SAP applications. A big announcement during this year’s SAPPHIRE was that Fiori is now included free with SAP software. This change was based on feedback from both customers and partners.
With the consumerization of IT, SAP users have been asking for a better user experience, so solutions such as SAP Business Suite are using SAP Fiori UX principles to deliver it. Fiori helps provide a consistent UX across multiple devices – desktop, tablet, and smartphone.
SAP HANA is at the center of everything that SAP is doing these days. I heard about HANA in every announcement/session I attended. There was a great discussion during the Day 2 keynote between Prof. Clayton from Harvard and Dr. Hasso Plattner. The takeaway was that we need to look across organizational boundaries in finance, supply chain, etc. and enable customer engagement. The HANA platform enables a non-aggregate-based solution focused on helping the customer get their job done, supporting both the transactional model and the gathering of insights from the data collected.
SAP on Microsoft Azure
SAP now officially supports Azure as a platform for production systems with a combination of SAP products, Operating Systems and DBMS systems. This is a big step in the relationship between Microsoft and SAP as Microsoft Azure is becoming a key cloud platform provider.
I have always discussed with my customers how Microsoft Azure can be used for SAP sandbox environments such as DEV, QA and training. Support for productive SAP environments is a big step, though there are several limitations and kinks, which I am sure will be fixed in time.
It was clear that this is the age of the omnipresent user. We all carry multiple devices: we might start our day at home with our smartphones, switch to a tablet on the way to work, to a desktop at work, and back in reverse order as we end our day. Added to this is the Internet of Things world consuming and collecting data. Enterprise applications need to address all of this. It is about devices, from your smartphone to your refrigerator.
The future is about providing highly-available solutions that provide consistent user-experience across devices anywhere, anytime.
In future articles I will be elaborating on each of these topics.