
Integrate 2014 – BizTalk Roadmap

Microsoft’s focus at Integrate 2014 was on the application platform and integration in a mobile-first, cloud-first world. The key message was to deliver the experiences consumers expect: apps running on mobile devices, using touch, voice, and sensors, with data coming from people as well as devices, and systems orchestrated globally using SaaS services.

There is also a shift in B2B/EDI implementations, with a need for global partner implementations that simplify the partner onboarding process and support “cloud-friendly” protocols.

Microsoft will be refactoring the application platform based on the following guiding principles:

  • Integrated app services
  • Secure, hybrid cloud (hybrid is still the reality and provides integration with on-premises connectivity)
  • SaaS connectivity
  • Business-user friendly (a self-service-oriented approach using web-based experiences and modern mobile devices)
  • Ecosystem driven
  • Enterprise grade
  • Cloud scale
  • Open (support for open-source tools and technologies)

A refactored application platform, with integration at the core, will be delivered to build modern applications:

  • Common application container and cross-platform extensibility model
  • Integrated API management to protect business-critical solutions, with authentication, key management, and built-in logging for analytics and insight
  • Microservices runtime and development ecosystem that will support cross-platform applications, with deep diagnostic logging functionality
  • Cloud orchestration engine, built and delivered by the BizTalk team and leveraging various microservices. A web-based process designer tool will enable developers and business users to easily define and track business processes. The orchestration engine will support both on-demand and long-running processes.
  • Complete Web + mobile experiences
  • BizTalk microservices will be built and made available by the product team on the Azure ecosystem, providing:
    • Out-of-the-box standard protocols (HTTP, FTP, SFTP, POP3, REST)
    • Common enterprise apps (SQL Server, SAP, Oracle, IBM DB2)
    • The most popular consumer and enterprise SaaS connectivity (Azure Service Bus, Blob Storage, OneDrive, Salesforce, Dynamics CRM, Workday, SuccessFactors, Yammer)
    • BizTalk mediation patterns and feature set (validation, batching, debatching, transformation)
    • Business rules
    • Trading partner management
    • AS2 protocol
    • Batching
    • EDIFACT and X12 protocols

Refactored application platform updates will be released in preview in Q1 2015. From the on-premises perspective, the message was to migrate on-premises BizTalk installations to 2013 or 2013 R2. The BizTalk team also reiterated its planned release cadence of a major version every year and a minor version every other year. Below is the support lifecycle for BizTalk, the operating system, and SQL Server.

 

BizTalk Version | BTS End of Support | BTS End of Life | OS End of Support | OS End of Life | SQL End of Support | SQL End of Life
BTS 2013 R2 | July 2018 | July 2023 | Windows 2012 R2 – Jan 2018 | Windows 2012 R2 – Jan 2023 | SQL 2014 – July 2019 | SQL 2014 – July 2024
BTS 2013 | July 2018 | July 2023 | Windows 2012 R2 – Jan 2018 | Windows 2012 R2 – Jan 2023 | SQL 2012 – July 2017 | SQL 2012 – July 2022
BTS 2010 | Jan 2016 | Jan 2021 | Windows 2008 R2 – Jan 2015 | Windows 2008 R2 – Jan 2020 | SQL 2008 R2 – July 2014 | SQL 2008 R2 – July 2019
BTS 2009 | July 2014 | July 2019 | Windows 2008 – Jan 2015 | Windows 2008 – Jan 2020 | SQL 2008 – July 2014 | SQL 2008 – July 2019
BTS 2006 and 2006 R2 | July 2011 | July 2016 | Windows 2003 – July 2010 | Windows 2003 – July 2015 | SQL 2005 – April 2011 | SQL 2005 – April 2016

 

The product team also shared general guidelines on how existing customers’ on-premises BizTalk implementations can leverage cloud features (hybrid scenarios):

B2B

  • Where EDI communication uses public internet protocols, EDI features can be migrated off the server to provide seamless integration with the same functionality on MABS 1.0 (Microsoft Azure BizTalk Services)

Azure Recovery Services

  • For disaster recovery across geographies, Azure Recovery Services is recommended for backing up and restoring BizTalk databases

Adapters on the Cloud

  • Web-facing adapters (HTTP, HTTPS, AS2, FTP, SFTP) can be hosted in the cloud and made highly available

Tracking Archive & Analysis

  • DocumentDB or a similar store can be used as an archive for tracking data, which can then be queried for complex BI scenarios using, for instance, Power BI

Happy BizTalking!

Posted in BizTalk, Cloud Computing, Integration

Test your RESTful API using PowerShell

Before I get into the “how” on this topic, I want to start by addressing the “why” and “when”, as in “Why / When the heck would I want to do this?” Here are some situations I’ve run into or thought of where having a functional test script was (or could be) rather handy:

  1. You have unit tests for your API, but also want some “client-facing” tests as well.
  2. You want to make sure when a new build is posted, the routes operate as they did before.
  3. You inherited an API that you need to maintain, but there are no unit tests.
  4. You need to provide some kind of automated test for your application, but do not have time / budget to write a full suite of unit tests.
  5. You do not like writing unit tests, but still need some kind of test coverage solution.
  6. You REALLY like automated tests; can’t get enough of them :).

So now with some (mostly) real-world uses in mind, let’s take a look at how this can be accomplished with PowerShell. A link to my code is posted at the bottom of this column in the last paragraph; take a minute to read through it, then come back to the next paragraph. I want to point out a few nuances that will make more sense within the context of the code sample.

Finished? Cool – let’s dig into the nitty-gritty:

Invoke-RestMethod vs. Invoke-WebRequest

If you read the code carefully, you probably noticed that I’ve used two different PowerShell functions to call the API – Invoke-RestMethod and Invoke-WebRequest. Gah? Why not just use one or the other? Well, I ran into a known issue while writing a test script for an API I built recently, and this was the recommended workaround. Essentially, if you’re going to make multiple subsequent PUT or DELETE requests from a PowerShell script (which you’ll probably want to do at some point), Invoke-WebRequest is the method to use for those HTTP verbs.
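
To make the distinction concrete, here is a minimal sketch (not the full test script) that reads a resource with Invoke-RestMethod and then updates and deletes it with Invoke-WebRequest. The jsonplaceholder URL matches the public test API referenced at the end of this post; the resource path and request body are just examples.

# GET: Invoke-RestMethod deserializes the JSON response straight into PowerShell objects.
$post = Invoke-RestMethod -Method Get -Uri "https://jsonplaceholder.typicode.com/posts/1" -DisableKeepAlive;

# PUT / DELETE: use Invoke-WebRequest to avoid the known issue with repeated PUT/DELETE
# calls from the same script; check StatusCode on the raw response instead.
$body = @{ id = 1; title = "updated title"; body = $post.body; userId = $post.userId } | ConvertTo-Json;
$putResponse = Invoke-WebRequest -Method Put -Uri "https://jsonplaceholder.typicode.com/posts/1" -Body $body -ContentType "application/json" -DisableKeepAlive;
$deleteResponse = Invoke-WebRequest -Method Delete -Uri "https://jsonplaceholder.typicode.com/posts/1" -DisableKeepAlive;

"PUT returned {0}; DELETE returned {1}" -f $putResponse.StatusCode, $deleteResponse.StatusCode;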

-SecondsAllowed

Each of my RunRoute functions in the script has a $SecondsAllowed parameter, which is passed through to the -TimeoutSec parameter on the underlying request. This allows you to do strict performance testing of your routes (should you choose to do so) by specifying the number of seconds before the route times out. If you don’t want to set a timeout for certain tests, just pass a zero.
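
As a rough idea of the shape these functions take (the actual Gist differs in its details), a RunRoute-style helper might look like the sketch below; the function name, parameters, and pass/fail logic here are illustrative assumptions, not the author’s exact code.

function RunGetRoute {
    param(
        [string]$Uri,           # route under test
        [int]$SecondsAllowed    # 0 = no timeout; otherwise passed through to -TimeoutSec
    )
    try {
        if ($SecondsAllowed -gt 0) {
            $result = Invoke-RestMethod -Method Get -Uri $Uri -DisableKeepAlive -TimeoutSec $SecondsAllowed;
        }
        else {
            $result = Invoke-RestMethod -Method Get -Uri $Uri -DisableKeepAlive;
        }
        return ($null -ne $result);   # PASS if the route returned anything at all
    }
    catch {
        return $false;                # FAIL on a timeout or any HTTP error
    }
}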

-DisableKeepAlive

I include this parameter on every request. By default, PowerShell uses keep-alive, which according to the documentation “[establishes] a persistent connection to the server to facilitate subsequent requests”; this switch turns that off. Keep-alive seems harmless, but in a production setting we have no clue IF a given caller will actually make subsequent requests. I’ve noticed test scripts will run slightly faster without this parameter (that is, with keep-alive left on), which can disrupt the timing of the -TimeoutSec check discussed in the previous paragraph and potentially give you a false sense of “real-world” performance.

-Headers

Invoke-RestMethod allows for a -Headers parameter, but I’m not using it in the code sample. The test API is very basic, and doesn’t require me to provide any headers to interact with it. However, I want to mention it here because you’ll likely need to add this to tests for a production API, since this is usually where you would exchange security information with the server. The param expects a hash object like so:

$RequestHeader = @{};
$RequestHeader.Add("Key", $Value);
Invoke-RestMethod ... -Headers $RequestHeader;

You can put as many custom headers into this collection as needed. Check the documentation for further details.

LOG_FILE

I don’t specify where to create the log file; it should default to C:\Users\{your-name}. Simply update this variable to change the output location. The log will contain an entry for each RunRoute call you make, indicating PASS or FAIL for the executed route, as well as the total run time for all of the routes in your script.
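
As a rough sketch of that logging pattern (the LOG_FILE variable comes from the script; the helper function and message format below are illustrative assumptions):

$LOG_FILE = "C:\Users\{your-name}\api-test-log.txt";     # update this variable to change the output location
$stopwatch = [System.Diagnostics.Stopwatch]::StartNew();

function Write-RouteResult {
    param([string]$RouteName, [bool]$Passed)
    $status = if ($Passed) { "PASS" } else { "FAIL" };
    Add-Content -Path $LOG_FILE -Value ("{0} - {1} - {2}" -f (Get-Date), $status, $RouteName);
}

# ... call your RunRoute functions here and pass each result to Write-RouteResult ...

$stopwatch.Stop();
Add-Content -Path $LOG_FILE -Value ("Total run time: {0:N2} seconds" -f $stopwatch.Elapsed.TotalSeconds);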


Here’s my script –  it’s available as a Gist on GitHub.

The API which I wrote the tests against (jsonplaceholder) is publicly accessible, so you should be able to just download the code and start running it. I’m hoping once you’ve done that and stepped through it a few times, it will be fairly clear how you can modify the structure I’ve provided to fit your specific API.

Posted in .NET Development, Web Development

What separates a cloud environment from the rest

The term “cloud” is frequently misused and abused in the industry, most commonly conflated with “some service accessible via the internet.” This is particularly misleading because it is hard to find services that are not available via the internet.

A server connected to the internet is not a cloud environment. If that were true, the laptop I am using to write this blog post would qualify as my personal cloud, since it runs Hyper-V hosting a few virtual servers. It would not qualify as a cloud environment even if those servers were accessible via a static IP or a domain name, as is the case with most data centers.

Here are some must-have features that make a cloud environment. These points are not only about Platform as a Service (PaaS), which leverages the cloud as a development platform, but also about basic Infrastructure as a Service (IaaS), where you lift or build your existing infrastructure in the cloud.

Geo Independence:
Perhaps the most important part of having your infrastructure in the cloud is geo, or location, independence. This means that you can host your servers at different geographic locations based on your user base and traffic sources. For example, Azure is available in 141 countries, with 8 regions within the US itself. So when you create or upload a VM in Azure, you can specify the region in which the server will be hosted, manage your farm accordingly, and place servers closer to target markets and users to give them quick response times.

Horizontal scalability:
The ability to increase capacity by adding multiple application servers that work as independent logical units is referred to as horizontal scalability. The classic example is an online retail store during the shopping season. When more traffic is expected, the business can quickly add more servers to the cluster to maintain high availability. When traffic slows down, those servers can be shut down and removed just as quickly to cut unneeded infrastructure cost. This is a huge advantage in any business situation, be it an online retail store or proof-testing an idea; in the latter case, the business might want to start with one lean server and add more as the business progresses.
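
For example, with a classic Azure cloud service and the classic (Service Management) Azure PowerShell module, scaling out for the shopping season and back in afterwards is a one-line change to the instance count. This is a sketch under those assumptions; the service and role names are placeholders.

# Scale the web tier out to 8 instances ahead of peak traffic...
Set-AzureRole -ServiceName "retail-store" -Slot "Production" -RoleName "WebRole" -Count 8;

# ...and back down to 2 instances once traffic returns to normal.
Set-AzureRole -ServiceName "retail-store" -Slot "Production" -RoleName "WebRole" -Count 2;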

Administration:
In a real cloud environment, there are multiple ways to reach your infrastructure, whether to make the kinds of changes mentioned above or for day-to-day health monitoring and maintenance. In a typical data center scenario, adding a server or changing its configuration means contacting your data center support team. In a cloud environment, these activities can be completed by logging into a self-service portal or perhaps just by running some scripts. The portal typically gives you a snapshot of your overall infrastructure, with many built-in utilities such as health monitoring and analytics covering traffic sources and other business-critical client information.
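
As an illustration using the classic Azure PowerShell module (a sketch, assuming the subscription has already been connected with Add-AzureAccount; the cloud service and VM names are placeholders), day-to-day tasks like checking on and stopping a server reduce to a few cmdlets:

# Check the current state of a VM hosted in a cloud service.
Get-AzureVM -ServiceName "my-cloud-service" -Name "web-01" | Select-Object Name, InstanceStatus, PowerState;

# Shut the VM down outside business hours, then bring it back up later.
Stop-AzureVM -ServiceName "my-cloud-service" -Name "web-01" -Force;
Start-AzureVM -ServiceName "my-cloud-service" -Name "web-01";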

The cloud environment also comes with a lot of security built in. Your application might be shielded from many vulnerability attacks simply because it is hosted in a secure infrastructure. Also, depending on the cloud provider and the nature of the business, you might not have to worry about infrastructure compliance requirements like HIPAA or the EU Model Clauses. Here’s a list of the compliance certifications that Azure brings to your application without any effort on your side.

Time to market / integration with the development environment:
Imagine this scenario: A developer makes changes to a development branch in a version control system. As soon as the changes are checked in, a build script gets triggered which:

  • Spins up a new VM
  • Takes the latest changes, creates a build and deploys it on the VM
  • Runs a suite of Unit and Regression tests and provides feedback to the team if anything fails
  • Automates User acceptance testing on this VM
  • Saves the successful release candidate and tears down the VM

This model is referred to as Continuous Integration (CI) and Continuous Delivery (CD), and a cloud environment can fully automate it, as sketched below. Because everything is running in the cloud, you don’t have to buy or manage servers for build or test environments. You can spin up environments using automated build scripts, run in-depth tests against them, and drop them when you are done.
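
A build script for this flow might be orchestrated along these lines; every function below (New-BuildVm, Publish-Build, Invoke-TestSuite, and so on) is a hypothetical placeholder for your own provisioning, deployment, and test tooling, not a real cmdlet.

# Hypothetical CI/CD orchestration sketch; each function stands in for your own tooling.
$vm = New-BuildVm -Name "ci-build-01";                     # spin up a new VM

Publish-Build -Source ".\src" -Target $vm;                 # build the latest changes and deploy them to the VM

$testsPassed = Invoke-TestSuite -Target $vm -Suite "Unit", "Regression";
if (-not $testsPassed) {
    Send-TeamNotification -Message "Unit/regression tests failed for the latest check-in";
}
else {
    Invoke-TestSuite -Target $vm -Suite "UserAcceptance" | Out-Null;   # automated UAT on this VM
    Save-ReleaseCandidate -Source $vm;                                 # keep the successful release candidate
}

Remove-BuildVm -Target $vm;                                # tear the VM down when finished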

This model significantly cuts down the time to market for new changes and ensures repeatable, consistent quality-check processes. With cloud integration, the cost of delivery drops dramatically, since you only pay for the time your machines are actually running, that is, while the script is building and testing.

Posted in Cloud Computing

BizTalk Advisory and Support

To help combat the complexity of today’s global, 24/7 service-centric marketplace, organizations implement architecture that helps them achieve better business agility and greater interoperability. Organizations leverage the power of Microsoft BizTalk® Server as a trusted solution in seeking a new generation of dynamic applications that provide quicker process change, improved business insight, and a competitive advantage.

With the growing use and capabilities of BizTalk Server, it can be challenging to capitalize on the full potential of the product. Enterprises struggle with the increasing cost of building, maintaining, and upgrading BizTalk integration solutions. They need to address the short-term and long-term value of the platform and have the right IT strategy for their business needs to enable broader cost savings.

To support companies that run any version of BizTalk, MPS Partners offers ongoing support of the server, providing a strategic and flexible delivery model to address specific needs. As a part of this offering, we:

  • Manage the BizTalk implementation
  • Establish governance
  • Provide ongoing technical support
  • Outline an adoption and deployment strategy
  • Review infrastructure, software configuration, application development or other key architectural elements
  • Mentor internal resources on best practices and technical approaches
  • Build artifacts or enhance existing artifacts
  • Create and maintain an issues-and-resolutions knowledge base

Customers purchase a block of hours each month to use as they wish for any level of BizTalk planning, development, or support. We work closely with an organization’s existing internal IT experts to ensure BizTalk Server implementations are designed, developed, and monitored in the best way possible.

Posted in BizTalk, Integration, Microsoft

How to Solve Material Master Data Problems in SAP ERP – Part 1

Master data is the lifeblood of any organization that uses an Enterprise Resource Planning (ERP) system such as SAP ERP. In this blog series we will discuss some of the challenges associated with creating and managing master data, and in particular material master data.

This first blog in the series provides some basic background and definitions for some of the terms we will be using throughout the series. This is especially important for those folks who may not be familiar with all the business processes where material master data is used.

Most successful companies use enterprise systems to complete the work needed to achieve their goals whether it is creating and delivering products or providing services to their customers. Companies have spent enormous amounts of capital to implement enterprise systems such as SAP ERP to allow them to manage a business process from beginning to end in an integrated and consistent manner.

One of the main goals for implementing an enterprise system is to improve efficiency by removing major obstacles to sharing and accessing information across functional areas.   SAP ERP is one of the most complex Enterprise Systems and focuses on operations that are performed within an organization.

It is quite common to refer to a specific set of functionality as a functional module in SAP. For example, a person well-versed in Material Management is called an MM Expert or an SAP MM Functional Expert. There are many modules in SAP that are designed to perform a specific business function within an organization. Some examples are Production Planning (PP), Material Management (MM), Sales and Distribution (SD), Financial Accounting (FI), etc.

It is also important to get a basic understanding of how the data is stored and organized in SAP ERP. There are basically three different types of data namely organizational data, master data, and transaction data. In this blog we will focus on Master Data and in particular Material Master Data.

Material Master Data is the most utilized and hence most important data in SAP and is used to store all the data for a material in SAP.   This data is also made available to a number of processes such as Material Management, Procurement Process (Procure to Pay), Fulfillment (Order To Cash), Inventory and Warehouse Management (IWM), Production Planning (PP), Shipping and Quality Management.

In order to manage a vast amount of data, material master data is organized in different tabs or categories commonly referred to as Views.   Each view stores data specific to a particular business function.

For example, views related to a purchasing process are basic data, financial accounting, purchasing, and plant/storage data. The views related to fulfillment process are basic data, sales organization data, and sales plant data. The views related to Production are Material Resource Planning (MRP) and work scheduling for each plant.   Similarly, the view related to Warehouse and Inventory management is Warehouse Management.

Figure 1: Material Master Views/Categories

As you can see from the above diagram, the Material Master data is comprised of data that relates to a number of different areas. For materials to be used successfully, the data has to be collected and entered into the Material Master record.

Not only does the data have to be entered for those areas but also for the specific organizational areas: plants, storage locations, sales organizations, and so on. For example, a material cannot be purchased without the relevant purchasing data being entered.

In addition, there are different types of materials, each with its own set of complex business rules. The most common material types are Finished Goods (FERT), Raw Material/Components (ROH), and Semi-finished Goods (HALB).

Now let’s discuss the master data that is relevant for the production order process: bills of material (BOM), work centers, product routings, and production versions. A BOM, sometimes referred to as a recipe depending on the industry, identifies the components that are necessary to produce a material. Routings identify the operations needed to produce the material. Work centers, as the name suggests, are where the operations are performed. A production version combines a BOM and routings for a material.

Finally, I’ll cover the master data that is needed for costing a material. The master data for product costing includes material master data (the costing and MRP views), BOMs, and routings.

The first step is to create material cost estimates.  The Material cost estimate is calculated for all the in-house manufactured products (finished goods and semi-finished goods). The system accesses the Bill of Material and Routing for the finished goods. It is used to calculate the cost of goods manufactured and cost of goods sold for each product unit.   Steps that follow the cost estimate are marking and releasing the cost estimate. Finally, the last step is to indicate in SAP that the material is live.

This completes a brief overview of material master data and some of the major processes involved in the end-to-end process of creating a material in SAP. In the next blog we will discuss some of the challenges associated with creating and managing material master data in SAP ERP.

Posted in SAP Integration

Agile BI: The Process & The Platform

When you hear the words Agile and Business Intelligence (BI) in the same sentence, what comes to mind?  Is it simply applying agile methodologies and concepts to your BI projects or is it something more?  Agile is defined as: “able to move quickly and easily”. It is characterized by being “quick, resourceful and adaptable”.  So what does this mean with respect to BI?  It means quickly responding to constantly changing information and analytic requirements.  It means visualizing data to quickly discern insights and increasing the quality of decisions.  It means having the tools and platforms in addition to appropriate practices in place to enable agility.

In order to understand the impact of agile BI on an organization, it is necessary to establish a foundation on how current BI solutions are developed and deployed and how traditional BI toolsets interact with these solutions. The following diagram depicts the classic approach to developing and deploying BI solutions from a data perspective:

The primary issue with this approach is that there is typically a long cycle time between when the requirements are defined, data sources are identified, and a usable deliverable is produced. This data process is often overlaid with a classic waterfall approach.

The waterfall approach is a sequential process used in software development that can also be applied to BI solution, data warehouse, and data mart development. A top-down waterfall approach starts with comprehensive requirements gathering and data source definition before proceeding with design and development. Conversely, a bottom-up waterfall approach focuses on individual data marts and derives the data source requirements from them.

A downside to the waterfall approach is that it contains a large amount of inertia, which precludes rapid changes in direction or design approaches.  That is not to say that it should be completely abandoned for enterprise wide BI solutions, but that it should be augmented to increase efficiency.

Another concern is that in a lot of instances, BI data may not be housed in enterprise systems.  In fact, 76% of all data needed to create BI reports already exists in standalone databases or spreadsheets. This means that content producers and information consumers may be able to gain valuable insights without extracting information from enterprise systems.

Also important to note is that users demand more from IT departments with shrinking resources. They ask for self-service platforms and reduced cycle times for BI solution deployments. They want advanced data visualizations that enable faster tactical and strategic decision-making.  In order to address these challenges, the traditional BI approach needs to adopt a more agile framework and platform.

Agile BI Methodology

The agile BI process can be depicted as:

This process, unlike waterfall, allows users to:

  • Identify smaller units of work and volumes of data, adding value throughout the process instead of waiting to develop an entire data mart.
  • Develop and deploy BI artifacts (reporting and analytics) alone or in workgroups.
  • Achieve insights earlier via data analysis and review instead of spending time defining a complex data integration process.
  • Adapt the process to specific needs and benefit from the flexibility of the Definition, Discovery, Evaluation, Deployment and Verification steps.

The Agile BI Platform

It is critical to acknowledge that agile BI is not just the implementation of an agile approach.  The platform is also an important component that can lead to the success or failure of the implementation of an agile BI environment. Evaluate solutions based on ease of implementation versus overall value.

 

The selected platform should address the following platform features and functions at each stage of the agile BI methodology:

Definition

With the need for rapid results and adaptability, agile BI platforms should deliver incremental value within a short duration. Ensure that the agile BI platform allows for social collaboration across multiple audiences and business functions of different sizes and domain knowledge. This will reduce risk and cost because issues will be surfaced earlier due to the smaller units of work.

Discovery

Considering that the content creator/information consumer has direct access to a large quantity of data, the BI platform should have the ability to easily connect to multiple heterogeneous data sources simultaneously. Consumers and content creators should be able to access and explore data with minimal IT intervention.

Evaluation

Data visualizations are key to agile BI platforms.  The platform must produce easy-to-understand charts and graphs in order to provide additional insights and facilitate data exploration. Questions asked of the data can serve to define an overall enterprise architecture at a later date.  This also means that there will be a much deeper collaboration between the content creator, information consumer and IT (co-location).

Deployment

The agile BI platform should facilitate fine-grained deployment to audiences of any size and support the social interaction of the audience as the deployed BI artifacts (reports, visualizations, etc.) are reviewed and discussed once the verification process commences. This further enhances and focuses collaboration, which can lead to better decisions being made at a much faster pace.

Verification

The most important step in the above process is closing the loop between “Verification and Definition” (Re-Definition).  It is imperative that this is performed in order to adapt to constantly changing information and analytic requirements.

The deployment of an agile BI environment is a journey comprised of agile methodology and a supporting BI platform.  It is not necessarily a monolithic environment, however.  There may be various best of breed components of a well-functioning agile BI environment, providing specialized capabilities to fulfill specific business needs. The agile BI environment should continually evolve based on changes in information and analytic requirements, market conditions, organizational DNA and unforeseen future needs.

 

 

Posted in Business Intelligence

Business Critical Line-Of-Business (LOB) Data in SharePoint 2013 – Using Winshuttle for no-code or minimal-code approach

Part 1 | Part 2 | Part 3

In this part 3 of my blog series, I’ll discuss a minimal code approach to integrate SAP with SharePoint. This approach would mean utilizing a tool or toolset to provide an end-to-end solution. I’ll further discuss how we can use the Winshuttle suite of products to enable integration of SAP with SharePoint 2013.

Using Winshuttle, one can create an end-to-end solution rather than just creating services that expose SAP data. Winshuttle has products that allow you to connect to multiple SAP touch points, such as SAP transactions, BAPIs, and tables, and expose the data as a web service. There are lots of tools on the market that allow you to connect to SAP BAPIs and tables, but Winshuttle is unique in that it allows you to record entire SAP transactions, such as MM01 to create a material in SAP. This allows a non-technical person well-versed in SAP transactions, such as a Master Data Analyst, to create a powerful web service that exposes a complete business process in a matter of minutes.

A separate set of Winshuttle products can then be used not only to create an electronic form that consumes the web services but also to design the business workflow process using a designer tool. In the end you can integrate most SAP business processes without writing a single line of code. For example, you can create a SharePoint portal that integrates SAP master data with SharePoint and further allows users to participate in a workflow.

A vast majority of solutions can be implemented by someone who is well versed in the tool and has a good understanding of SAP LOB data. This is a significant benefit, as it reduces the constant dependence on IT, and the ability to have more self-service solutions makes the business more agile. As a result, the savings are significant, as even the simplest self-service solution can save countless hours that would otherwise be spent manually entering data into SAP.

Anyone who has been involved in creating SAP integration solutions for the Microsoft stack of products knows very well that no two SAP integrations are the same and that no product on the market will be able to solve all the business problems. Ideally you want all your customers to build their requirements around the strengths of an enablement platform such as Winshuttle. In reality, however, that’s not always the case. Very frequently you will run into issues where a product, at least out of the box, is not going to solve all your needs.

This is where Winshuttle is different. Yes, it allows you to create moderately complex no-code solutions, but for those advanced, application-like scenarios, the platform allows more advanced users such as IT programmers to extend the solution. The person implementing the solutions doesn’t necessarily have to be an IT programmer, but it will certainly help to have such a person implement more advanced solutions, as they will require some custom coding. As a result, extremely powerful SAP data automation solutions can be created using Winshuttle with minimal coding.

Part 1 | Part 2 | Part 3

 

Posted in SAP Integration

Line of Business Driving Agile BI

It is no longer acceptable to wait months to build a data warehouse, create reports, or add new insights to a dashboard; the world and business are moving too fast. Today, customers already have solutions in place that they are using to provide insights into most of their questions. In fact, 83% of data already exists in some reportable format. The rest of the information they are looking for also exists but is typically stored in local databases or spreadsheets. The goal of an agile BI solution is to quickly connect the existing BI solution with these new databases or spreadsheets to deliver new insights.

OLD DOG, NEW TRICK

The traditional BI approach involves defining business requirements at the entity level, building and architecting a data model, configuring an ETL process, and then building reports and dashboards over the data. An agile BI approach starts with the data visualization and leverages existing data sources regardless of format or structure. It leverages the cloud for quick results and focuses on the user experience and collaboration. This agile BI approach allows users to see results in days or weeks, not months or quarters.

We are not saying you should build your data warehouse on standalone spreadsheets or databases, but as new insights are being defined and tested, a more agile approach is required to quickly validate assumptions and deliver results. In the last two years we have seen an explosion in the Line of Business driving business intelligence solutions. In most instances these people have a deep understanding of the data, and they have already built interfaces or created extracts for the information they need. The goal when working with these LOB users is not to build the perfect data warehouse. The goal is to leverage what is already in place and deliver results. Most companies already have all the traditional reports and dashboards in place to run the business. The new insights LOB users are looking for involve combining the existing solution with new data sources.

CONCLUSION

Cloud solutions and public and partner data sources are offering a completely new opportunity to uncover insights. Companies today are finding value by combining existing data with third-party or public data. Census data, retail data, and consumer data all offer new ways to look at the business and deliver value. New data sources and tools offer opportunities for Line of Business users to validate assumptions and uncover business value by leveraging agile BI.

Posted in Business Intelligence

Design and document your RESTful API using RAML

Creating documentation is one of the more tedious aspects of the software development process. Not that writing algorithms and object-oriented class hierarchies isn’t tedious, but it often seems (at least to me) that writing plain text explaining your code is more tedious than writing the code itself. Also, if you consider yourself an Agile practitioner (as I do), you’re always evaluating the time spent vs. value added whenever you write much about working software outside of and separate from the working software you already wrote.

I find myself constantly analyzing and re-analyzing the darn thing. Is it too long? Too short? Too wordy? Oversimplified? Too granular? Too vague? Should I have even attempted this?

I don’t claim this conundrum is unique to me. It’s one of the main reasons that many software projects are undocumented or under-documented. Because of this, a tool that helps you write clear, simple documentation, and saves you time doing it, is an invaluable find.

RAML is one of these fabulous creatures.

I was introduced to RAML by a colleague while working on an ASP.NET Web API project over the last few months. What I like best about RAML is that it solves multiple documentation problems through the creation and maintenance of a single file:

  • You can design and modify your API collaboratively without writing / re-writing any code.
  • The syntax is simple and to-the-point, reducing time needed to properly describe the routes, actions and parameters.
  • This document serves as a team “contract” of sorts throughout the development process. Server-side developers, client-side / mobile developers, and testers – everyone – can see and agree on the intended behavior and content of your API requests and responses.
  • When the project is done, you’re left with an excellent artifact to hand off and / or publish describing exactly how your API works.

Now that’s what I call valuable documentation. To get started, all you really need to write a RAML doc is a text editor and access to the spec. However, the team over at MuleSoft has developed a superb browser-based editor for RAML, complete with spec references and “intellisense” of sorts. Best of all – it’s free to use. They’ve labeled it as “API Designer” on their site.


I used it for the project I mentioned above; I found it to be far quicker than writing a text-only version. Being able to click on and view the routes on the right side of the screen was also a really handy feature during various design and testing discussions. I found the API Designer to be a solid tool, and highly recommend it.

So the next time you kick off a REST API project, *start* with the documentation – and write some RAML.

Posted in Mobile Solutions, Web Development

Mobile First – Cloud First Simplified (3 of 3)

← Previous post in this series

The conversation about business outreach (service first) and infrastructure elasticity (cloud) does not feel complete without including…

Big Data

Every generation of technology upgrade has in some way created the need for the next. User-created content and social media initially drove the need for big data techniques. However, the drivers of this movement added up quickly once it was understood what big data analysis and prediction can do for a business.

A Quick Introduction

Big data commonly refers to the mining and analysis of extremely high volumes of data. The data in question is not structured, since it is collected from a variety of sources, many of which do not follow any standard storage format or schema. The data is also described by its characteristics, primarily volume, variety, velocity, variability, and complexity. The techniques for analyzing this data involve algorithms that engage multiple processors and servers, distributing the data to be processed in a logical way. Map-Reduce is one of the many popular algorithms, and Hadoop is one of the popular implementations of Map-Reduce.
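
To make the map/reduce idea concrete, here is a toy single-machine sketch in PowerShell (real frameworks such as Hadoop distribute these same two phases across many servers): the “map” step emits a (word, 1) pair for every word, and the “reduce” step sums the counts for each word.

$lines = @(
    "mobile first cloud first",
    "cloud first big data"
);

# Map: emit a (word, 1) pair for every word on every line.
$pairs = $lines | ForEach-Object { $_ -split "\s+" } | ForEach-Object {
    [pscustomobject]@{ Word = $_; Count = 1 }
};

# Shuffle + Reduce: group the pairs by word and sum the counts in each group.
$wordCounts = $pairs | Group-Object -Property Word | ForEach-Object {
    [pscustomobject]@{ Word = $_.Name; Count = ($_.Group | Measure-Object -Property Count -Sum).Sum }
};

$wordCounts | Sort-Object -Property Count -Descending | Format-Table -AutoSize;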

Big data techniques are not something that only corporations collecting social media data need; they are something every business needs to look into sooner rather than later. It is a separate topic that every business needs to factor in social media data in some form or another. Even if that part is ignored, the volume of structured data is increasing by the day.

Keeping all of that in mind, it is important to explain how well big data fits with the elasticity of the cloud. Imagine an operation where data needs to be segregated by some specific parameter onto different servers. These servers might run some processing depending on the type of data, or simply store the data to improve access time. A true cloud environment is the perfect host for such an operation: you can spin up new servers with a specific configuration at run time with just a few lines of script.

Where are we heading

In 2011, Google Fiber announced Kansas City as the first city to receive 1 gigabit per second internet speeds, followed by Austin, TX and Provo, UT. Per the company’s announcement in February 2014, Google Fiber will reach another 34 cities. AT&T stepped up by announcing that its new service, GigaPower, will provide gigabit internet speeds to as many as 100 municipalities in 25 metropolitan areas. Besides Google and AT&T, many other large and small providers are working on targeted areas to provide superfast internet speeds, such as Cox, Greenlight Network, Canby Telcom, CenturyLink, Sonic.net, etc.

Considering this new scale of bandwidth, the way application technology works is going to change, especially the parts that involve mobile and the cloud. It will be much more convenient to have a memory- and processor-intensive operation running in a cloud environment, streaming status and results to the browser on your laptop or a small handheld mobile device.

Moving the heavy lifting to the cloud while keeping control on low-resource devices is not something that is going to happen someday; it is happening now, and only the scale and reach are going to increase exponentially. Everyone connected to this field should pay attention to the changes and keep a strategy for the future, be they providers, consumers, decision makers, technology workers, business users, or consultants.

Mobile first cloud first simplified.

Posted in Cloud Computing, Mobile Solutions
 