
Dec 08

Survey Uncovers Financial Industry Challenges

Financial services organizations rely heavily on information found on public websites, social networks, and web portals to monitor markets, track the competition, identify suspicious activity and fraud, maintain sanction lists, automate processes with B2B partners, and listen to what customers are saying. Access to these external sources of structured and unstructured information typically requires manual integration, which leads to tedious searching, copying, and pasting of data into spreadsheets, databases, or applications. Because this information is often time sensitive, relying on manual processes defeats the purpose of gathering it in the first place. Time is money.

These organizations also depend on an IT infrastructure to meet these needs, and their data integration strategy must address the growing need to access external data sources. Integrating internal systems with external sources is challenging, to say the least, especially when organizations are constantly adding new external sources to their operations, and when those external websites and web portals either don't provide APIs or the development effort is too time consuming and costly. Keep in mind: if IT is already struggling to keep up with business demands to add new sources of information and to eliminate repetitive manual data handling, keeping pace in the future becomes unattainable for most financial services organizations.

A recent Computerworld.com survey of 100 financial services professionals highlights the challenges of acquiring and integrating data from multiple sources, including external websites and web portals. The survey revealed troubling challenges facing the financial industry. Here are a few highlights:

  • Struggles with integrating external sources: 43% of the participating financial institutions struggle with a lack of integration between external data sources and internal systems. Of the external sources these organizations need to integrate with, 84% are web portals from which business information must be extracted and integrated into internal systems and processes.
  • Manual or hard-coded integration: 55% of respondents reported that moving data between external sources and internal systems either involves users manually transferring data or relies on custom-developed, hard-coded integrations that do not scale to support many external data sources.
  • Manual data handling: These organizations identified the time required to manually import data and to perform validations as the two most costly consequences of being unable to integrate external data sources.
  • Deployment delays: Respondents want a solution that quickly adapts to varying data sources, yet integration projects often take months to complete. Only 8% of responding financial services firms said an external information integration project is completed in less than a month, and 31% reported it takes more than three months, illustrating the need for a faster and more efficient way to perform external data integration.

The bottom line: manual processes no longer fit into any financial organization's business processes. These time-consuming development projects for integrating external data sources into an enterprise infrastructure are clearly not a viable long-term strategy.

Financial organizations depend on data, whether it's used to transform industries, grow market share, defend brands, or protect customers. Integrating that data takes an alternative approach, one that does not simply rely on traditional development tools and custom one-off projects. Data integration platforms that are easy to deploy and customize are the next step for external data integration.

Download the complete IDG survey here.

Nov 26

Recently, I participated in the yearly CIO Logistics Forum with Henrik Olsen, Head of Business Architecture & Development at DSV, a global transportation and logistics provider. The topic: Streamlining Logistics Operations & Automating B2B Processes.

During the presentation, Olsen observed an increasing demand from customers (e.g., manufacturers of goods) for lower prices, improved service, and real-time data integration, while freight haulers are asking for higher rates to compensate for increased costs.

With price pressures coming from both customers and freight haulers, one of the few ways to improve profit is to increase operational efficiency through automation of B2B interactions and internal processes.

Typically, automating these processes becomes challenging as more and more customers move away from supporting Electronic Data Interchange (EDI). This is especially true for mid to small-size customers who cannot afford to keep up with the demand for EDI integration.

These smaller customers "dictate" their preferred way of integrating and exchanging information. The integration is typically driven through an email-based solution and/or through a web portal, often using Excel as the interchange format.

A typical scenario might go like this:

  • An email with an order is generated directly from the customer’s Enterprise Resource Planning (ERP) or Transportation Management System (TMS).
  • The transportation and logistics supplier receives the email with the shipping order request and processes the information.
  • The customer requires near real-time update of tracking information posted to their logistics web portal.
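The receiving side of this scenario is where automation starts: pulling the order attachment out of the inbound email instead of handling it by hand. The following is a minimal Python sketch (not DSV's actual implementation); the sender address, attachment name, and CSV order layout are invented for illustration, and real customers would more often send Excel files, which need an extra parsing step.

```python
# Sketch: extract order line items from an inbound customer email.
# The message format below is an illustrative assumption, not DSV's.
import csv
import io
from email import message_from_bytes
from email.message import EmailMessage

def extract_order_rows(raw_email: bytes) -> list[dict]:
    """Return order line items from the first CSV attachment, if any."""
    msg = message_from_bytes(raw_email)
    for part in msg.walk():
        if part.get_content_type() == "text/csv":
            text = part.get_payload(decode=True).decode("utf-8")
            return list(csv.DictReader(io.StringIO(text)))
    return []

# Build a sample email the way a customer's ERP/TMS might send it.
sample = EmailMessage()
sample["From"] = "erp@customer.example"        # hypothetical address
sample["Subject"] = "Shipping order 1042"
sample.set_content("Order attached.")
sample.add_attachment(
    "order_id,sku,qty\n1042,PALLET-A,3\n",
    subtype="csv",
    filename="order_1042.csv",                 # hypothetical filename
)

rows = extract_order_rows(sample.as_bytes())
print(rows[0]["sku"])  # -> PALLET-A
```

From here, the extracted rows could be pushed into the supplier's TMS, and the tracking-update step would run in the opposite direction, posting status back to the customer's portal.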

Of course, this process is simple for customers since they don’t have to support EDI, and they will choose to only engage transportation and logistics suppliers who allow them to deliver the data in this flexible manner.

For the supplier, however, such orders are more difficult and expensive to handle, as the entire customer integration process becomes manual.

The good news is all these manual processes can be automated, and this is exactly what Henrik explained in his well-received presentation.

DSV plans to automate a considerable number of these non-EDI B2B interactions and take advantage of the higher freight prices and better margins it can obtain from smaller customers. This is what I like to call the long-tail effect (see diagram), where technology like EDI is too expensive and complex to implement for low-volume customers, but alternative solutions are available to facilitate automation and integration between business partners.

Many transportation and logistics companies around the world are finding alternatives to EDI, reducing integration costs with smaller customers by as much as 100 times through complete automation of previously manual B2B processes. The result is a substantial increase in profits and improvement in the bottom line.

Many thanks to Henrik Olsen for presenting on this important topic. If you are curious to see how this works, I recommend you watch this short video.

Stefan Andreasen, Corporate Evangelist, Kapow at Kofax.


Aug 06

Earlier this year I penned a blog about the ‘Unknowns’ of Big Data Integration.  The focal point of the piece was the need for businesses to become more situationally aware by quickly harnessing data from a variety of sources to try to prepare for a ‘Black Swan’ event in the course of their business cycle(s).

Last week, Kapow announced that we would join forces with Kofax to lead the industry in making data more integrated and actionable in the enterprise.  The immediate alignment of our companies and the scale of the Kofax organization have the potential to create a new disruptive force in the Big Data Integration sector.

Over the years, we (at Kapow) have always championed the “TIME TO VALUE” (TTV) metric as the real justification for making economic business decisions.  The timing to join forces with Kofax will accelerate the integrated data deliverables to our customers and the larger Kofax family of customers.

The past 3 ½ years as CEO of Kapow provided me with a great opportunity to work with incredible people (our family) in an effort to deliver the best product and service to our customers.  In turn, our customers have been extraordinary in their support of Kapow and that was clearly demonstrated at our first ever user summit this past March.

I look forward to an unparalleled growth period for Kapow as part of the Kofax team with access to Kofax’s extensive resources, financial strength and global presence.  The best of times lie ahead as we continue to revolutionize the way data, coming from a wide variety of sources, is used in the enterprise to drive the business forward with greater precision, agility and affordability.

Thank you all for your support over the years and for your continued support for many years to come.

Authored by: John Yapaola, CEO

Nov 06

KapowNow Webinars AdWe’re so excited to announce the KapowNow webinar series. This is not just another set of webinars. We’ve specifically designed them so everyone can fit them into their busy schedules. They are short! In less than 30 minutes, you can learn about application integration, process automation, rapid creation of business applications, Big Data, social media monitoring, mobile enablement and much more.

Watch live demos and get all the tips and tricks you need to make integration less stressful, your business users more empowered, and your company more agile and innovative. This series will show you how to use Kapow Katalyst and Kapow Kapplets as a single application integration platform, solving disconnects across the organization, such as getting analysts their data, providing employees mobile access to legacy applications, and eliminating manual processes, to name just a few examples.

Register today and spread the word by tweeting and using the hashtag #kapownow.

By: Hila Segal, Director, Product Marketing

May 02

Traditionally, when we talk about how data relates to applications, we think of data as the bottom of the three layers of the application stack: user interface, application logic, and data.

But that distinction is evaporating fast, and soon we will end up with only one layer: the application itself.

Modern application frameworks also define the data model and tie it tightly with the application logic and the user interface. This means that even if you could access the data directly in the SQL database, the data won’t make much sense without the application logic that ties it together.

Similarly, application frameworks are also used to build the user interface, which means that application logic in modern applications is also woven into the user interface; for example, into the JavaScript of a web application, leveraging one of many AJAX frameworks to create a vivid user experience.

Now let's look at BIG DATA applications. These don't even use standard SQL databases anymore, but specialized NoSQL stores like HBase and Hive, built on the Hadoop processing framework. Capabilities like search and analytics are tied together in one platform, and the data can only be accessed through the BIG DATA application itself. If you need to brush up on NoSQL technologies, check out this good (albeit a bit old) article, Decline of the Enterprise Data Warehouse, by Road to Failure blogger Bradford.

The consequence of all this is that data and applications are becoming one and the same thing.

Similarly, data integration and application integration products are becoming one and the same thing, and anyone who needs to access data will think of data in an application, not data standing alone.

To deal with this, we need to develop innovative new integration technologies that can access data through the application — a hybrid data integration and application integration platform, so to speak, but much more agile, since the pace of business and the impatience of business users today don’t permit waiting months or years to leverage that data.

The Kapow Innovation is the scalable Kapow Katalyst Enterprise Application Integration Platform, which can live either on-premise or in the cloud; a technology that can access data in the modern world, a world where there are no longer defined application layers.

Do you agree or disagree?

By: Stefan Andreasen

Nov 08

In 2007, James Governor penned an article, Why Applications Are Like Fish and Data Is Like Wine, depicting how data gets better with age, while apps, like fish, begin to smell over time.

Earlier this year, Chuck Hollis from EMC offered his own wrinkle on the topic, also making a case for keeping the data (wine) and dumping those ‘lumbering apps’ (stale fish).

Both offer interesting reads, but both too quickly dismiss the value of apps. Fish and wine go well together, and depending on how you pair them, they can make or break the experience or value of the meal. The same is true in the modern enterprise: the fish (application) is equally important as, if not more important than, the wine (data), especially when the apps are kept "fresh and simple" with "many varieties to choose from…"

Consider what would be possible if all the data trapped within applications were easily accessible and could interact just as easily with other applications.

There should be a straightforward way to interact with and automate as many 'fresh' application sources as possible, in the shortest amount of time. IT organizations can no longer remain 'comfortably numb' in their avoidance of these agile Line Of Business (LOB) activities.

So, does the LOB really want a customary IT application integration fix or are they asking for something different? When the LOB says, “I really need access to this data now!” what they are really asking for is to interact with applications (or websites) in an impromptu manner in order to keep pace with emerging business dynamics and acquire key data advantages for the business.

The blistering pace and expansion of social media access, in all its forms (brand management, blog monitoring, anti-piracy, competitive intelligence, analyst research, prospect & customer mapping, consumer behavior analysis, marketing research, partner & reseller communication, on-line videos, risk compliance, legal monitoring, corporate reputation, background checks, R&D innovation research, consumer & customer research, sentiment analysis, predictive analytics, social CRM) makes the LOB integration demand on IT seem like an insurmountable task. The intensification of interactions from sources without API coverage (or even marginal API compliance) can no longer shackle the LOB data integration needs.

The traditional corporate IT blueprint has been to deploy long established application integration methodologies. This old design is already showing the strain of abandonment. I would propose a modern application integration approach consisting of the following components:

APPLICATION INTEGRATION PILLARS:

All four pillars are key to providing real-time integration, but they must also move toward a new form of LOB automation – a self-service component for the LOB. Ubiquitous access with the ability to quickly prototype and test application interactions are the most critical components to this offering. Applications need to be integrated in order for the LOB to interact and drive time-to-value (TTV) in the enterprise.

Adding to the turmoil for IT organizations are the hundreds (often thousands) of in-house applications constructed over the years. The vast majority of these have not been SOA-enabled. Those 'lumbering apps' fall within the LOB's needs as well. There's valuable data trapped in those in-house apps that must be freed.

Integrating to applications will require more frequent and varied connections with real-time and on-demand communications. These integrations may be more permanent links to applications or have ‘throw-away’ integration conditions.

The LOB is forcing the application integration challenge to the forefront of the IT stack. Applications are the real-time component of the raging BIG DATA frenzy. The day is coming, when the LOB will self-serve most of their application and website interactions.

The data wine cellars of the enterprise will be cared for by the IT stewards, but the Line Of Business wants FISH.

By: John Yapaola

Aug 24

Solving the problem of application integration has always been slow, painful, and expensive.

In the original EAI (Enterprise Application Integration) days, integration meant buying (or building) application adapters (or interfaces) and then coding the connection between the adapters in an ESB/SOA framework. It was a heavy, complex project that depended on expensive software from IBM, BEA, or Tibco and required highly skilled developers. The business often waited years to receive the results it needed.

If we fast-forward to today, we live in a "connected" world, and most of our business transactions and processes are based on applications that may reside anywhere: on the web, inside our company, at our business partners, or as SaaS apps in the cloud. Consequently, the business need to connect applications has grown to Herculean dimensions, and traditional EAI/SOA methods are going the way of the dinosaurs.

To effectively deal with today’s exploding integration requests, we need a much more LOB-centric and far more agile method to integrate applications.

A number of "cloud integration" vendors like Cast Iron, Boomi, and SnapLogic have arisen, and even long-time larger ETL vendors like Informatica are trying to meet the LOB's needs. They promise agility based on ready-made "API Connectors": if you want an integration from, say, SAP to Salesforce, you find the SAP-Salesforce API Connector, configure it, and you are done. Or… are you really?

Therein lies the problem with this method: the promise sounds great, but the reality falls far short of it, for some obvious reasons.

I recently met with a CIO of a large Silicon Valley based company who said “We use more than 70 different apps in the cloud as well as a number of internal apps, and all I see are these ‘faces’ of the apps, without an easy way to connect them.”

Let's say they have an additional 30 internal apps; that means about 100 apps to run their business. Connecting them all pairwise would require at least n(n-1)/2 = 4,950 API connectors. And that's not taking into account that for each app you might have dozens of different functions to perform, each needing a separate API connector. Well, you get the picture: solving this with pre-built API connectors is "mission impossible."
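The arithmetic behind that figure is simple combinatorics: connecting every pair of n applications point-to-point requires n(n-1)/2 connectors, so the connector count grows quadratically while the number of apps grows only linearly. A few lines of Python make the scaling concrete:

```python
# Point-to-point integration math: each unordered pair of apps needs
# its own connector, i.e. "n choose 2" connectors for n apps.
def pairwise_connectors(n_apps: int) -> int:
    """Number of distinct app-to-app connectors for n apps."""
    return n_apps * (n_apps - 1) // 2

assert pairwise_connectors(100) == 4950  # the ~100-app case in the post
print(pairwise_connectors(70))           # the 70 cloud apps alone: 2415
```

Doubling the number of apps roughly quadruples the number of connectors, which is why the pre-built-connector model breaks down as app portfolios grow.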

On top of this, many of those apps or functions don't have APIs at all, which makes the connector approach literally impossible for them.

So what is the solution?

Well, what if you could get a product that allowed you to build a custom integration, with no coding, in about the same time it would take to search for an API Connector (assuming one exists)?

Wouldn’t that be awesome?

Well, this is exactly what the Kapow Katalyst Application Integration Platform offers. It uniquely combines the power of a cloud integration product; a visual, flowchart-based design and development environment; a high-performance application automation browser; and a collaborative management console.

Kapow Katalyst Application Integration Platform is the 3rd generation of Application Integration products, already proven in production at more than 500 companies all over the world.

It delivers on the promise, every time.

This is the new, better way to connect your applications and automate your business processes.

Want to discuss? Drop me an email.

By: Stefan Andreasen

May 03

Do you dare to step out of your comfort zone? The world is changing at a faster and faster pace, especially in the world of IT.

Driven by the explosion of internet connectivity, distribution of data sources, and the number of legacy and networked applications, impatient business demands for IT projects are growing both in numbers and in complexity.

The Line of Business needs projects delivered yesterday, with data as fresh as in the original data source. "Rotten" data is no longer acceptable.

In a situation like this, the traditional methods of working often fall short, something that can be very difficult to realize when changes happen gradually. Things don’t get out of control from one day to the next, but over months and years, until it’s too late.

I tried to illustrate the situation in the 3D graph above. The area of control (blue) is gradually turning out of control (red).

  • Data demand has moved from batch to real-time
  • Complexity has increased with the distribution of applications and data sources
  • Business urgency requires unrealistic time-to-market goals
  • Number of IT projects is growing, but the budget is not

In a changing environment like this, we constantly need to challenge our processes and tools.

So my question to you is: "Are you ready to change the way you do things? Do you dare to step out of your comfort zone?"

If not, you’ve got a problem. You might not see this problem on a daily basis; it’s one of those problems that slowly sneaks up on you until one day it’s too late to fix.

It's happened many times before; just look at DEC (Digital Equipment Corporation), the hugely successful minicomputer company of the 80s that basically died overnight. They did not step outside their comfort zone and subsequently disappeared.

Kapow Software's revolutionary innovation, the Kapow Extraction Browser (which, by the way, is the only purpose-built integration browser on the market), has proven to help hundreds of IT projects overcome these challenges and make an easy jump back from the red to the blue area.

All applications today have either an API or a GUI through which data and transactions can be accessed and controlled. By leveraging that fact, you are no longer dependent on the data and application owners. That's a big deal. No more rewriting of legacy apps and no more begging the application owner to provide APIs. Just go do it.

That is a new way of thinking and a new paradigm. Are you ready for it?

Not only does it work, it's proven by more than 500 customers worldwide. It's also delivering incredible business value and business agility, quite possibly to your competitors.

Step out of your box and try Kapow Software yourself!

By: Stefan Andreasen

Mar 25

Data assembly is now the biggest barrier to good analytics

Business Intelligence is becoming ever more strategic to companies competing in today's global economy. Every department now uses analytics to better understand financials, business processes, customers, competitors, and market trends, the critical understanding needed to optimize execution.

As we all know, analytics is no better than the data behind it, and thus the discovery and assembly of data have become an ever more important part of successful business intelligence.

As your company's ecosystem grows beyond your firewall into partner apps, competitor websites, and social networks, data spreads rapidly, and more and more data assembly ends up tied to manual harvesting methods or the purchase of dubious data from vertical information providers.

This means that the knowledge worker spends more time on Data Discovery and Data Assembly, leaving less time for analyzing and acting on the results.

I often see scenarios where knowledge workers spend more than 50% of their time on data assembly alone, time taken away from analysis, reporting, and execution.

This is not good.

And it's exactly why more companies are automating the data assembly process, finding methods to easily and scalably specify which data to get from where, and how to transform it into the needed format. Basically, they are looking for a solution for automated data delivery.

The good news is that this solution already exists. The Kapow Katalyst platform is proven by more than 500 companies all over the world.

Here's a concrete example. Fiserv, a large financial services company, needed to understand the value of its assets in real time for compliance reasons. To solve this problem, the treasurer hired a group of people to manually log in to Fiserv's accounts spread over more than 300 banks in more than 20 countries. This was expensive and error-prone, and the data was often outdated.

Consequently, Fiserv looked for an automated solution and found Kapow Katalyst. Within three months, they had built Kapow ETL robots that automatically log in to the web front end of Fiserv's accounts at all 300 banks and pull out the required information. Not only did this relieve the knowledge workers of manual data assembly, it also gave the treasurer real-time data for point-in-time regulatory compliance.

Needless to say, this created a lot of value for Fiserv.

I recommend you read the whitepaper, Hyper Management of Working Capital, written by Thomas W. Warsop, Group President for Fiserv.

By: Stefan Andreasen

Feb 07

Cloud Integration is more than just pre-existing connectors and SalesForce integration

As companies move their IT infrastructure and business applications to SaaS and the cloud, the need for "cloud integration" grows: the ability to integrate data between applications in the hybrid world of internal apps, cloud apps, and business partner apps.

Many Cloud Integration companies claim to offer a complete solution for integrating cloud and SaaS apps, but I claim they are all incomplete. Why?

If you check their demos and use cases, you quickly realize they provide a solution that only works if you have access to existing (and documented) web service APIs. Consequently, most of their examples center on salesforce.com integration and salesforce.com APIs.

This approach does not match the real cloud world, a world that is far from homogeneous; it is hybrid, fragmented, and distributed.

A cloud integration solution is only complete if it can integrate all applications in the cloud, whether they have Web Service APIs or not.

Today there are more than 200 million websites/applications on the internet, many with complex features and data structures behind them. In this more holistic, big picture, only a fraction of these millions of websites are covered by documented APIs, making most traditional cloud integration solutions decidedly incomplete. The likelihood that your next cloud integration project will not be covered by "standard" connectors is very high, and therefore you need a cloud integration platform that can integrate with any layer in the application stack: database (SQL), web service (SOAP, REST), or presentation layer (HTML, AJAX).
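To make the presentation-layer option concrete, here is a toy Python example, using only the standard library, of reading data out of HTML when no API exists. The markup and class names are invented for the illustration; real pages, especially AJAX-heavy ones, are considerably harder, which is what purpose-built integration browsers address.

```python
# Toy "presentation layer as API" example: scrape values out of HTML
# using only the stdlib parser. The page markup below is invented.
from html.parser import HTMLParser

class PriceScraper(HTMLParser):
    """Collect the text of every <td class="price"> cell."""
    def __init__(self):
        super().__init__()
        self.in_price = False
        self.prices: list[str] = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag.
        if tag == "td" and ("class", "price") in attrs:
            self.in_price = True

    def handle_data(self, data):
        if self.in_price:
            self.prices.append(data.strip())
            self.in_price = False

page = """<table>
  <tr><td>ACME</td><td class="price">102.50</td></tr>
  <tr><td>Initech</td><td class="price">87.10</td></tr>
</table>"""

scraper = PriceScraper()
scraper.feed(page)
print(scraper.prices)  # -> ['102.50', '87.10']
```

The point is not that a few lines of parsing solve cloud integration, but that the presentation layer is a real integration surface whenever SQL access and web service APIs are unavailable.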

Based on the patented Kapow Extraction Browser, which leverages any HTML/AJAX application interface as an API when no web services API is present, Kapow Katalyst is the only Cloud Integration platform with a complete data extraction, data integration, data transformation, and data migration solution for your cloud integration challenges.

Proven by more than 500 customers worldwide, Kapow Katalyst is the only cloud integration platform that provides total connectivity and total data delivery in the cloud, in the enterprise, and with your business partners.

Are you ready for your next cloud integration project?

By: Stefan Andreasen


The Kapow Katalyst Blog is…

... a collection of insights, perspectives, and thought leadership around Application Integration.

Comments, Feedback, Contact Us:

blog at kapowsoftware.com

Get Our RSS Feed