Integration enables multi-vendor environments

To take advantage of the best software for each specific enterprise challenge, and to minimize the cost and time of maintaining on-premises software deployments, more and more businesses are selecting best-of-breed SaaS applications for each functional area.

For human resources and finance software, many companies are choosing Workday’s Human Capital Management (HCM) Suite as an optimal solution. Workday is a SaaS application that depends on systems and applications within an organization — even those from different vendors — being able to communicate seamlessly with each other.

Businesses that use Workday HCM software need to be able to integrate Workday’s applications with enterprise resource planning (ERP) systems from different vendors, like Oracle, SAP, or PeopleSoft. In order to implement Workday as a solution, organizations need a seamless way to integrate Workday with an existing ERP.

The Workday HCM suite

Workday’s HCM suite covers an extensive number of applications that merge talent management, recruiting, payroll, benefits, financials, and more into a single platform. It includes everything from an applicant tracking system (ATS) to back office HR management software — and everything in between.

Workday recruiting software helps recruiters find, engage, and hire candidates as well as manage referrals, social media integration, offers, sourcing analytics, and onboarding. Its workforce management software assists with performance management, career development, compensation, and staff planning and organization.

Meanwhile, Workday’s back office management applications help organizations run their payroll and benefits operations and to manage their expenses.

Challenges of integrating Workday with ERP systems

Complex custom integrations

Creating connectivity and integration between HR applications and an ERP system is a common challenge for businesses. As recruiters find potential candidates and track applications, the information needs to be synchronized with databases and ERP systems such as SAP, Oracle, and PeopleSoft.

For businesses to operate efficiently, all of their systems, services, and applications need to be seamlessly integrated. Businesses often attempt to address this need by creating custom connectivity and implementing point-to-point integration.

This approach applies custom code directly between two endpoints, creating a tightly coupled dependency between them. While this type of integration is manageable in an environment with only a few systems, the complexity of custom integrations increases exponentially as additional systems and services are added over time.

Increased complexity, meanwhile, brings increased costs — as organizations are forced to divert ever more time and resources to building and maintaining integration in an increasingly fragile and inflexible architecture.

The need for a hybrid solution

As cloud-based mobile and SaaS applications become increasingly integral to business operations, the need for a hybrid integration platform that can bridge such applications to on-premises ERP systems becomes ever more critical. Businesses that opt for a best-of-breed SaaS strategy require a hybrid on-premises and in-the-cloud integration solution that can go wherever the very best systems and applications take them.

Navigating ERP system vendors

Another challenge organizations often face as they attempt to integrate Workday with their ERP systems is the often opaque structure of the ERP vendors themselves. Businesses often find it difficult to navigate the sprawling internal organizations of the most common ERP system providers in order to identify and locate the right group or person to go to with questions, requests for assistance, or concerns. This, in turn, complicates the already difficult task of integrating these very complex systems.

MuleSoft takes the pain out of integrating Workday with ERP software

With out-of-the-box connectivity, MuleSoft’s Anypoint™ Connectors make it easy to connect Workday to hundreds of popular on-premises and SaaS applications, like SAP, Oracle, PeopleSoft, NetSuite, Salesforce.com, and ServiceNow.

Users can easily build integrations within Anypoint Studio, MuleSoft’s graphical design environment, and deploy them directly either to Mule as an ESB or to CloudHub, a cloud-based integration platform as a service.

These and many other components are all part of MuleSoft’s Anypoint Platform™ and work together to provide businesses with a comprehensive integration solution, both on-premises and in the cloud. This next-generation platform provides a complete set of solutions to help businesses overcome the challenge of integrating such systems in a matter of days, not weeks or months.

Moreover, MuleSoft has developed deep and close-knit relationships with major ERP system vendors, so we’re able to navigate their complex structures and collaborate with key partners to quickly deliver effective integration solutions.

Key Benefits: Connect and scale

With MuleSoft’s Anypoint Platform, you can quickly and efficiently build integrations to Workday that are proven to scale over time. Connect Workday to other SaaS and on-premises applications no matter where you are along the continuum toward full cloud migration.

With an underlying platform that contains a single development environment and reliable multi-tenant architecture, Anypoint Platform connects Workday and other Human Capital Management (HCM) applications to all your other third-party enterprise applications, whether they are in the cloud or behind your firewall. It also offers a future-proof solution that scales over time and will accommodate new features and upgrades to existing apps — and even new apps that have yet to be developed.

Connecting Workday gives you a single view of your employees from hire to retire, your HR system as the single source of truth for employee information, and the ability to provision and de-provision employees in real time. Anypoint Platform makes it possible to strengthen, extend, and scale your HR system across the enterprise by unlocking the back office and unwinding your legacy systems.

Connect Workday to the Enterprise
  • Extend Workday to the rest of the enterprise to connect new applications and unwind legacy systems
  • Solve your enterprise integration needs both in the cloud and on-premises with MuleSoft’s hybrid integration model.
  • Replace costly, custom point-to-point integrations.
  • Eliminate manual data entry and loss of information, while providing employee record consistency across systems
Get up and running quickly
  • Connect to applications instantly with our Anypoint Connector for Workday — fully supported and kept up to date with the Workday API, with access to pre-built connectors to top HCM, CRM, and financial applications
  • Get up and running quickly with pre-built templates for HCM (e.g., Workday to SAP Payroll) and ITSM (e.g., Workday to ServiceNow)
Streamline employee management
  • Simplify data migration between Workday and your legacy HR or ERP system, in real time or in batch. Build a single view of employee data across HCM, financials, ERP, CRM, and SCM applications. Streamline management of employee data from HCM applications to downstream back office systems.
  • Optimize recruiting, onboarding, and retention efforts. Mobilize your back office data by connecting Workday solutions to custom-built mobile applications.

To get in-depth knowledge, enroll for a live free demo on Workday training

ServiceNow Integration with Azure DevOps (VSTS)

Integration overview

In an Application Lifecycle Management (ALM) ecosystem, the choice of systems and the collaboration between cross-functional teams play a major role. While the choice of systems impacts a team’s productivity, cross-functional collaboration helps teams get the complete context of the business requirements.


Best-of-breed systems such as Azure DevOps (VSTS) and ServiceNow bring rich functionalities to the ecosystem.

By integrating Azure DevOps with ServiceNow, enterprises can reduce the collaboration barriers between development and customer service teams that otherwise lead to quality issues, delivery delays, and financial loss.

How Azure DevOps (VSTS) – ServiceNow integration is beneficial for an enterprise

With Azure DevOps (VSTS) + ServiceNow integration, enterprises get:

  • Real-time access to customer issues and priorities
  • Communication on work items from within each team’s native system
  • Real-time updates when a customer issue is resolved

How OpsHub Integration Manager integrates Azure DevOps (VSTS) and ServiceNow

OpsHub Integration Manager integrates Azure DevOps and ServiceNow bi-directionally. It ensures that all historical and current data is available to each user, in that user’s preferred system, with full context, in real time. All ‘tickets’ from ServiceNow automatically synchronize to Azure DevOps, and updates to the entities and details associated with those ‘tickets’ synchronize back to ServiceNow.

Use Case: Azure DevOps (VSTS) integration with ServiceNow

Problem statement: The support team receives a ticket from a customer, identifies it as a ‘problem’, and shares the details of the ‘problem’ with the development team via email. Three days later, a support team representative writes a follow-up email to the development team to check the status of the ‘problem’.

The development team then updates the support team representative that the ‘problem’ was resolved two days earlier.

Solution: When Azure DevOps and ServiceNow are bi-directionally integrated using OpsHub Integration Manager, the status of the ‘problem’ would automatically change in ServiceNow as soon as its status is changed to ‘resolved’ in Azure DevOps.

  1. Multiple customers log tickets in ServiceNow citing similar Skype-related issues.
  2. The support team identifies these tickets and reports them as a ‘problem’ to the backend team.
  3. OpsHub Integration Manager synchronizes this ‘problem’ as a ‘bug’ in Azure DevOps.
  4. The backend team examines the defect and requests more information from the support team to resolve the bug.

To get in-depth knowledge, enroll for a live free demo on Servicenow Training

How Mule’s future will be shaped by its past

Understanding MuleSoft’s past offering will help us better appreciate its foundational strengths and explain why MuleSoft is set to become a real strategic player in the integration space in 2020.

ESB (the integration term that should not be named)

I have been fortunate enough to work in the integration space, and in particular within MuleSoft API-led integration, since its early years of industry adoption. In those days, Mule was positioned as an ESB* – a term which has since achieved Voldemort status – ‘the integration term that shall not be named.’

This is because in the last 5 years we have seen MuleSoft make the paradigm shift to API-led connectivity and away from ESB-led connectivity – mirroring the behaviour of the industry as a whole.

As enterprises look to accelerate their digital journeys, APIs offer more flexibility to keep up with the fast pace of digital innovation than ESBs, which are used to integrate various systems of record for stable, well-understood business processes.

As Ross Mason (Founder) explained at MuleSoft Connect in October: “it’s not about the big eating the small, it’s about the fast eating the slow.” In the tech industry, no matter your size, you must innovate fast or die – ESB models were out and API-led connectivity was in, so MuleSoft made the move.

How do current MuleSoft customers feel about this move?

As a Business Development Manager at WHISHWORKS, I have the privilege of speaking with very experienced individuals in their fields – both from technical and business tracks.

These professionals, who are embarking on API-led transformation, have explained that their experience with Mule has typically been truly progressive, but has also involved a huge learning curve.

This is because with the adoption of any new tech, especially in its infancy, there is a certain amount of ‘skilling-up’ required. This is not necessarily in its core language – in this case predominantly Java – but rather in the best practices surrounding it. This is particularly true in the new MuleSoft API-led view, where DevOps is so closely linked.

As a result, I have noticed a considerable increase in conversations surrounding best practice, governance and correctly leveraging a C4E (Centre for Enablement) – something I will be going into more detail on in my next post.

How can embracing Mule’s ESB roots drive API-led solutions in 2020?

Greater demand from businesses for bigger and better IT solutions cannot be supported by IT budgets that, on the whole, remain stagnant and are simply used to keep the “lights on.” This results in little innovation with prolonged product release cycles.

Yes, DevOps and Agile are a start to tackle incumbent problems, but are not enough to deal with the growing demand and dwindling resources.

The solution? A new operating model with a productised API at its heart and an understanding of the positive outcomes of an ESB foundation – the cumulative efforts which have resulted in a mature community and maturing business and project practices.

In short, if we treat APIs as business assets which are made reusable and available through an exchange, we essentially enable lines of business within any organisation to self-serve. The Mule narrative makes it clear that the product is only half of the battle; the consumption of said assets is just as vital.

Final thoughts

With the desire among customers to press innovation forward, we should be retrospectively looking at projects to see how we could do them better in the future. The increase in re-rationalising APIs, ensuring they are correctly structured and leveraging the three-tiered approach is crucial to the entire story.

*According to MuleSoft, an Enterprise Service Bus (ESB) is fundamentally an architecture. It is a set of rules and principles for integrating numerous applications together over a bus-like infrastructure.

ESB products enable users to build this type of architecture, but vary in the way that they do it and the capabilities that they offer. The core concept of the ESB architecture is that you integrate different applications by putting a communication bus between them and then enable each application to talk to the bus.

This decouples systems from each other, allowing them to communicate without dependency on or knowledge of other systems on the bus. The concept of ESB was born out of the need to move away from point-to-point integration, which becomes brittle and hard to manage over time.

To get in-depth knowledge, enroll for a live free demo on Mulesoft Training

Workday makes its big analytics bet, launches Prism Analytics, data-as-a-service, benchmarking

Workday has integrated the technology behind Platfora and is betting that the ability to analyze new data sources will complement its people and financial information.

Workday is all-in on analytics as it launches Prism Analytics and a data-as-a-service effort, and aims to use its platform to become the front end to other enterprise systems.

At Workday Rising, the company fleshed out Prism Analytics, which is based on the acquisition of Platfora. Workday spent much of the last year integrating Platfora onto one code base instead of bolting systems together in what would have been another addition to a cloud menu.

When Prism is coupled with Workday’s move to open its cloud platform to developers, it’s clear that the company is looking to gain more enterprise data under management. “Prism will work with Workday and non-Workday data sets,” said Dan Beck, senior vice president, technical platform. “Workday will also bring in other operational data.”

Visualization tools were added in Workday 29, and the Workday 30 spring release will have more tools that utilize in-memory processing and benchmarking.

These efforts will combine with a data-as-a-service play that will utilize Workday’s HR and financial customer base data–anonymized–for benchmarking information.

While Workday’s efforts should fare well, it’s worth noting that most cloud vendors are offering analytics and connections to datasets across an enterprise. Salesforce has its Wave Analytics coupled with Einstein.

Oracle, Microsoft and SAP also have analytics efforts. And then there are pure plays such as Tableau, SAS and Qlik. What’s unclear is what analytics platform can really win when most cloud vendors have their own spin.

  • Workday’s analytics makes sense in that it will start focused on its own customers. Beck said Prism and Workday’s benchmarking service will likely be add-ons for customers that are already in with human capital and financial software.
  • “We will put this all together in one system to plan, process and analyze and then extend the cloud platform,” said Beck.

PUTTING IT ALL TOGETHER

  • Workday acquired Platfora in June 2016, outlined integration plans at Workday Rising that September and this year is delivering at its powwow.
  • Pete Schlampp, vice president of Workday Analytics via the Platfora purchase, said the integration on the back end went well since both companies were built on Hadoop and Spark and big data approaches.
  • The rest of Platfora’s technology adopted everything from Workday’s security approach to data schema to design language. Platfora’s key engines–data prep, integration, analytics and visualization–were rebuilt from scratch.
  • “Beginning in 2016, the companies started talking about what Workday customers were doing and how they wanted to analyze data about their people and financials in context of third party data. But data was in silos across the premises,” said Schlampp.
  • These customer needs weren’t that surprising. A hospitality company may have data in a point of sale system that it needs to connect to people and financials in a Workday system.
  • “Prism is designed to connect and synchronize in the Workday cloud and keep it secure,” said Schlampp. “This is not Platfora reskinned. Workday talks about the power of one and not single products.”
  • Workday will build out the Prism partnership and integration roster.
  • Will Prism Analytics be the forerunner to HCM and financials in some deals? Schlampp said the core focus is existing Workday customers that see analytics as a differentiator. “Analytics is more important for customers. Customers don’t make analytics decisions. They make better business decisions.”

To get in-depth knowledge, enroll for a live free demo on Tableau Course

The New Database Connector in Mule 4

There’s a brand new Mule 4 Anypoint Connector for Database (DB) that you can use to connect to any relational Database engine. Unlike other connectors such as File or FTP, this connector has a pretty similar UX compared to the one in Mule 3.x, with some considerable improvements, including:

– Improved operations: Operations are streamlined and now simpler to use. We’ve extracted bulk functionality into its own set of operations, so you no longer have operations that change behavior depending on the received payload.

– Dynamic queries simplified: Now there’s a single experience for executing static and dynamic queries.

– Embeddable transformations: You can now embed DataWeave transformations inside the insert/update operations, so you can construct the datasets you want to send to the DB without having a side effect on the message or using enrichers.

– Streaming simplified: You no longer have to worry about configuring streaming on your operations. The connector will use Mule’s new streaming framework to handle that automatically.

You can now even execute a select statement and process the results asynchronously without worrying about leaking connections!

Let’s take a quick tour and see what’s new.

Select

The select operation is used to retrieve information from the RDBMS. The primary concept of this operation is that you will supply a SQL query and use DataWeave to supply the parameters:

<flow name="selectParameterizedQuery">
  <db:select config-ref="dbConfig">
    <db:sql>select * from PLANET where name = :name</db:sql>
    <db:input-parameters>
      #[{'name' : payload}]
    </db:input-parameters>
  </db:select>
</flow>

As you can see in the above example, input parameters are supplied as key-value pairs, which we can now create by embedding a DataWeave script.

Those keys are used in conjunction with the colon character (:) to reference a parameter value by name. This is the recommended approach for using parameters in your query. The advantages of doing it this way are listed below, followed by a plain-JDBC comparison:

  • The query becomes immune to SQL injection attacks
  • The connector can perform optimizations that are not possible otherwise, which improves the application’s overall performance
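
These are the same guarantees that parameter binding gives you in plain JDBC. As a rough illustration only (assuming the PLANET table from the flow above and an existing java.sql.Connection; this is illustrative Java, not connector code):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ParameterizedSelect {

    // Runs the same query as the flow above, with the name bound as a
    // parameter instead of being concatenated into the SQL string.
    public static void printPlanets(Connection connection, String name) throws Exception {
        String sql = "SELECT * FROM PLANET WHERE name = ?";
        try (PreparedStatement statement = connection.prepareStatement(sql)) {
            statement.setString(1, name); // the value is bound, never parsed as SQL
            try (ResultSet results = statement.executeQuery()) {
                while (results.next()) {
                    System.out.println(results.getString("name"));
                }
            }
        }
    }
}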

Dynamic Queries

Sometimes, you not only need to parameterize the WHERE clause, but also parameterize parts of the query itself. Example use cases for this would be queries which need to hit online/historic tables depending on a condition, or complex queries where the projected columns need to vary.

In Mule 3, the concept of select was split into parameterized and dynamic queries, and you couldn’t use both at the same time. You had to choose between having a dynamic query or having the advantages of using parameters (SQL injection protection, PreparedStatement optimization, etc.). Furthermore, the syntax to do one or the other was different, so you had to learn two different ways of doing the same thing.

But with the new Database Connector in Mule 4, you can now use both methods at the same time by using expressions in the query. In this example, you can see how a full expression is used to produce the query by building a string in which the table depends on a variable. An important thing to notice is that although the query text is dynamic, it is still using input parameters:

<set-variable variableName="table" value="PLANET"/>
<db:select config-ref="dbConfig">
 <db:sql>#["SELECT * FROM $(vars.table) WHERE name = :name"]</db:sql>
 <db:input-parameters>
   #[{'name' : payload}]
 </db:input-parameters>
</db:select>

Why do I need dynamic queries at all for the example above? Can I just treat the table like another input parameter? The answer is no. Input parameters can only be applied to parameters in a WHERE clause. To modify any other part of the query, you need to use DataWeave’s interpolation operator.

Streaming large results

Database tables tend to be big. One single query might return tens of thousands of records, especially when dealing with integration use cases. Streaming is a great solution for this. What does streaming mean? Suppose you have a query which returns 10K rows; attempting to fetch all those rows at once will result in the following:

  • Performance degradation, since that’s a big pull from the network
  • A risk of running out of memory, since all that information needs to be loaded into RAM.

Streaming means that the connector will not fetch the 10K rows at once; instead, it will fetch a smaller chunk, and once that chunk has been consumed it will go fetch the rest. That way, you can reduce pressure over the network and memory.
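
The connector takes care of this for you in Mule 4, but conceptually the chunked fetching resembles setting a fetch size on a plain JDBC statement. A rough sketch, assuming an existing java.sql.Connection and the PLANET table used earlier:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class StreamedSelect {

    // Asks the driver to pull rows in chunks of 200 rather than loading
    // the entire result set into memory at once.
    public static void process(Connection connection) throws Exception {
        try (PreparedStatement statement = connection.prepareStatement("SELECT * FROM PLANET")) {
            statement.setFetchSize(200);
            try (ResultSet results = statement.executeQuery()) {
                while (results.next()) {
                    // handle one row at a time here
                }
            }
        }
    }
}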

Streaming in Mule 4 vs Streaming in Mule 3

In Mule 3.x this was something you had to specifically enable because it was disabled by default. In Mule 4, it is transparent and always enabled: you don’t have to worry about it anymore; you can simply trust that the feature is there.

Another improvement from Mule 3 is that we can now use the new repeatable streams mechanism from Mule 4. That means that streams are now repeatable and you can have DataWeave and other components process the same stream many times, even in parallel.

Insert, Update and Delete

The insert, update, and delete operations were also upgraded in the same way. You can use DataWeave parameters and get dynamic queries as well:

	<db:insert config-ref="dbConfig">
	  <db:sql>
	    INSERT INTO PLANET(POSITION, NAME, DESCRIPTION) VALUES (777, 'Pluto', :description)
	  </db:sql>
	  <db:input-parameters>
	    #[
	    {'description' : payload}
	    ]
	  </db:input-parameters>
	</db:insert>

<db:delete config-ref="dbConfig">
	  <db:sql>
	    DELETE FROM PLANET where POSITION = :position
	  </db:sql>
	  <db:input-parameters>
	  #[
	    {'position' : 7}
	  ]
	  </db:input-parameters>
	</db:delete>

Bulk Operations

The insert, update, and delete operations we saw above are fine for the cases in which each input parameter can take only one value.

For example, when deleting, many rows could match the criteria and get deleted, but only one criterion (POSITION = X) is provided. The same concept applies to update: if you do UPDATE PRODUCTS set PRICE = PRICE * 0.9 where PRICE > :price, you may be applying a 10% discount on many products, but the price input parameter will only take one value.

What happens if we want to apply different discount rates on products that have different prices? Well, we could do it by executing many operations. For example, assume you have a payload which is a list of objects of the following structure: { price : number, discountRate: number }. Then we could do this:

<foreach>
  <db:update config-ref="dbConfig">
    <db:sql>
      UPDATE PRODUCTS set PRICE = PRICE * :discountRate where PRICE > :price
    </db:sql>
    <db:input-parameters>
      #[{
        'discountRate' : payload.discountRate,
        'price' : payload.price
      }]
    </db:input-parameters>
  </db:update>
</foreach>

That method would certainly get the job done; however, it is highly inefficient. One query needs to be executed for each element in the list. That means that for each element we will have to:

  • Parse the query
  • Resolve parameters
  • Grab a connection to the DB (either by getting one from the pool or establishing a new one)
  • Pay all the network overhead
  • The RDBMS has to process the query and apply changes
  • Release the connection

You can avoid all of the above steps by doing a bulk operation. When you look at it, there’s only one query here; the update statement is constant, not dynamic. The only thing that changes is that, on each iteration, we supply a different set of parameters.

Bulk operations allow you to do exactly that: run one single query using a set of parameter values. Make no mistake though, this is not just a shortcut for the <foreach> above; it uses features of the JDBC API (sketched in the example after this list) so that:

  • Query is parsed only once
  • Only one DB connection is required since a single statement is executed
  • Network overhead is minimized
  • RDBMS can execute the bulk operation atomically
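
As a rough plain-JDBC sketch of the same discount update (assuming Java 16+ for the record type, an existing java.sql.Connection, and the PRODUCTS table described above; the connector does the equivalent for you):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.List;

public class BulkDiscountUpdate {

    record Discount(double price, double discountRate) { }

    // The statement is prepared once; each element only contributes its
    // parameters, and the rows are sent to the database as one batch.
    public static void apply(Connection connection, List<Discount> discounts) throws Exception {
        String sql = "UPDATE PRODUCTS SET PRICE = PRICE * ? WHERE PRICE > ?";
        try (PreparedStatement statement = connection.prepareStatement(sql)) {
            for (Discount discount : discounts) {
                statement.setDouble(1, discount.discountRate());
                statement.setDouble(2, discount.price());
                statement.addBatch();
            }
            statement.executeBatch();
        }
    }
}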

For these use cases, the connector offers three operations: <bulk-insert>, <bulk-update>, and <bulk-delete>.

These are pretty similar to their single counterparts, except that instead of receiving input parameters as key-value pairs, they expect them as a list of key-value pairs.

Let’s look at an example:

	<db:bulk-insert config-ref="dbConfig" >
	  <db:sql>
	    insert into customers (id, name, lastName) values (:id, :name, :lastName)
	  </db:sql>
	  <db:bulk-input-parameters>
	    #[[{'id': 2, 'name': 'George', 'lastName': 'Costanza'}, {'id': 3, 'name': 'Cosmo', 'lastName': 'Kramer'}]]
	  </db:bulk-input-parameters>
	</db:bulk-insert>

To get in-depth knowledge, enroll for a live free demo on Mulesoft Training

How Machine Learning and Adaptive Methods Are Revolutionizing Integration

For years, those of us in the technology sectors have been building integrations between disparate systems. In fact, enterprise organizations often have specific resources dedicated toward building and maintaining integrations between mission critical systems.

According to dictionary.com, the word integrate is defined as a verb meaning “to bring together or incorporate (parts) into a whole.” For decades, technologists have been manually creating integrations between systems whose interfaces have been continuously changing.

Those of us who were lucky got the opportunity to create an integration between a mainframe and a front-line database, thus ensuring that fewer updates to the integration were necessary.

Those of us who had to create integrations between the multitude of SaaS services on the market, however, determined that a frequent maintenance plan and update cycle were an absolute requirement.

As years have gone by, integration has often been treated as a synonym for synchronization.

While keeping data in sync between multiple databases and platforms is a key part of the integration process, it’s not nearly the totality of the integration problems organizations often face. Focusing primarily on synchronization leaves out the most important part: automation. 

Automating different components of business has always been approached with trepidation.

While there are a number of things that could easily become automated (those rote tasks performed 1,000 times a day), more core components of the business could not benefit from automation due to the analysis required in decision making.

It’s time that we as technologists take a cue from another part of the industry: natural language processing (NLP).

An NLP system uses the concept of intents to make an educated guess about what a person is trying to say and what the resulting action should be. For example, when I speak to an automated system at an airline, I may say “I need to know if my flight is on time.”

That phrase will then get passed through the NLP and compared against several different intents in their system.

Other examples of intents for an airline may include “Speak to a representative,” “Purchase a ticket,” or “Get updated flight information.”

In the specific case of my first example, the intent would map to “get updated flight information.”

Once the intent is known, we can map specific actions. In this example, the action would be a response with the current status of the flight. 
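
As a toy illustration (not any particular NLP product’s API), mapping a resolved intent to an action can be as simple as a lookup table; the intent names and responses below are hypothetical:

import java.util.Map;
import java.util.function.Supplier;

public class IntentRouter {

    // Hypothetical intents and handlers; a real NLP service would return the
    // intent name (usually with a confidence score) after classifying the utterance.
    private static final Map<String, Supplier<String>> ACTIONS = Map.of(
        "get-flight-status", () -> "Flight UA123 is on time.",
        "speak-to-representative", () -> "Connecting you to an agent...",
        "purchase-ticket", () -> "Let's find you a flight."
    );

    public static String respond(String intent) {
        return ACTIONS.getOrDefault(intent, () -> "Sorry, I didn't catch that.").get();
    }

    public static void main(String[] args) {
        // "I need to know if my flight is on time" -> classified as get-flight-status
        System.out.println(respond("get-flight-status"));
    }
}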

The next major step forward in integration technology will take its cue from those that have come before. Utilizing some of the new techniques being developed by iPaaS providers in concert with machine learning techniques such as the example given above, we can come close to the holy grail of integration requirements.

An integration developer should be able to explain the ultimate intent for the integration and have an intelligent system create a blueprint for review.

Following a successful review, the same system should be able to deploy the integration and manage any changes to the underlying systems. 

Imagine being able to work visually with an environment that allows me to explain my intent.

For example, every time a lead comes into Marketo with a certain lead score, I need to automatically contact the head of sales and initiate marketing automation.

To give general intent and then receive a blueprint on the execution by connecting different API endpoints (from Marketo, Salesforce, and Microsoft Outlook) would be just the start of a world of possibilities for automation that creates and maintains integrations.

We’re currently looking down the barrel of the Internet of Things, but we’re still approaching integration in traditional ways.

Artificial intelligence combined with new tools for building integrations can bring about that ever-elusive goal: adaptive and intelligent integration.

Building a system that maintains integrations based upon the ultimate intent as opposed to some rigid constraints will be a necessity when attempting to seamlessly link platforms that manage thousands (if not millions) of devices.

The reason for pushing automated integration creation and management is to ensure we’re ready for the incredible amount of data to be processed from the oncoming IoT storm. However, it will have tremendous value elsewhere as well.

To get in-depth knowledge, enroll for a live free demo on Mulesoft Online Training

The Role of Traditional ETL in Big Data

ETL tools combine three important functions (extract, transform, load) required to get data from one big data environment and put it into another data environment.

Traditionally, ETL has been used with batch processing in data warehouse environments.

Data warehouses provide business users with a way to consolidate information to analyze and report on data relevant to their business focus. ETL tools are used to transform data into the format required by data warehouses.

The transformation is actually done in an intermediate location before the data is loaded into the data warehouse. Many software vendors, including IBM, Informatica, Pervasive, Talend, and Pentaho, provide ETL software tools.

ETL provides the underlying infrastructure for integration by performing three important functions:

  • Extract: Read data from the source database.
  • Transform: Convert the format of the extracted data so that it conforms to the requirements of the target database. Transformation is done by using rules or merging data with other data.
  • Load: Write data to the target database.
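
A heavily simplified sketch of those three steps in plain Java and JDBC (the connection URLs, the SRC_ORDERS and DW_ORDERS tables, and the currency rule are all hypothetical; real ETL tools add scheduling, error handling, and bulk loading on top of this):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class MiniEtl {

    public static void main(String[] args) throws Exception {
        // Placeholder JDBC URLs; substitute your own source and warehouse connections
        try (Connection source = DriverManager.getConnection("jdbc:yourdb://source-host/sales");
             Connection target = DriverManager.getConnection("jdbc:yourdb://warehouse-host/dw")) {

            // Extract: read rows from the source system
            try (PreparedStatement extract = source.prepareStatement(
                     "SELECT order_id, amount, currency FROM SRC_ORDERS");
                 ResultSet rows = extract.executeQuery();
                 // Load: write into the warehouse's target schema
                 PreparedStatement load = target.prepareStatement(
                     "INSERT INTO DW_ORDERS (order_id, amount_usd) VALUES (?, ?)")) {

                while (rows.next()) {
                    // Transform: convert each row into the format the warehouse expects
                    double amountUsd = toUsd(rows.getDouble("amount"), rows.getString("currency"));
                    load.setLong(1, rows.getLong("order_id"));
                    load.setDouble(2, amountUsd);
                    load.executeUpdate();
                }
            }
        }
    }

    // Placeholder transformation rule; a real pipeline would look up exchange rates
    private static double toUsd(double amount, String currency) {
        return "USD".equals(currency) ? amount : amount * 1.1;
    }
}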

However, ETL is evolving to support integration across much more than traditional data warehouses.

ETL can support integration across transactional systems, operational data stores, BI platforms, MDM hubs, the cloud, and Hadoop platforms.

ETL software vendors are extending their solutions to provide big data extraction, transformation, and loading between Hadoop and traditional data management platforms.

ETL and software tools for other data integration processes like data cleansing, profiling, and auditing all work on different aspects of the data to ensure that the data will be deemed trustworthy.

ETL tools integrate with data quality tools, and many incorporate tools for data cleansing, data mapping, and identifying data lineage. With ETL, you only extract the data you will need for the integration.

ETL tools are needed for the loading and conversion of structured and unstructured data into Hadoop.

Advanced ETL tools can read and write multiple files in parallel from and to Hadoop to simplify how data is merged into a common transformation process.

Some solutions incorporate libraries of prebuilt ETL transformations for both the transaction and interaction data that run on Hadoop or a traditional grid infrastructure.

Data transformation is the process of changing the format of data so that it can be used by different applications. This may mean a change from the format the data is stored in into the format needed by the application that will use the data.

This process also includes mapping instructions so that applications are told how to get the data they need to process.

The process of data transformation is made far more complex because of the staggering growth in the amount of unstructured data. A business application such as a customer relationship management (CRM) system has specific requirements for how data should be stored.

The data is likely to be structured in the organized rows and columns of a relational database. Data is semi-structured or unstructured if it does not follow rigid format requirements.

The information contained in an e-mail message is considered unstructured, for example. Some of a company’s most important information is in unstructured and semi-structured forms such as documents, e-mail messages, complex messaging formats, customer support interactions, transactions, and information coming from packaged applications like ERP and CRM.

Data transformation tools are not designed to work well with unstructured data. As a result, companies needing to incorporate unstructured information into their business process decision making have been faced with a significant amount of manual coding to accomplish the required data integration.

Given the growth and importance of unstructured data to decision making, ETL solutions from major vendors are beginning to offer standardized approaches to transforming unstructured data so that it can be more easily integrated with operational structured data.

Big data is most useful if you can do something with it, but how do you analyze it? Companies like Amazon and Google are masters at analyzing big data. And they use the resulting knowledge to gain a competitive advantage.

Just think about Amazon’s recommendation engine. The company takes all your buying history together with what it knows about you, your buying patterns, and the buying patterns of people like you to come up with some pretty good suggestions. It’s a marketing machine, and its big data analytics capabilities have made it extremely successful.

The ability to analyze big data provides unique opportunities for your organization as well. You’ll be able to expand the kind of analysis you can do. Instead of being limited to sampling large data sets, you can now use much more detailed and complete data to do your analysis. However, analyzing big data can also be challenging. Changing algorithms and technology, even for basic data analysis, often has to be addressed with big data.

The first question that you need to ask yourself before you dive into big data analysis is what problem are you trying to solve? You may not even be sure of what you are looking for.

You know you have lots of data that you think you can get valuable insight from. And certainly, patterns can emerge from that data before you understand why they are there.

If you think about it though, you’re sure to have an idea of what you’re interested in.

For instance, are you interested in predicting customer behavior to prevent churn? Do you want to analyze the driving patterns of your customers for insurance premium purposes?

Are you interested in looking at your system log data to ultimately predict when problems might occur? The kind of high-level problem is going to drive the analytics you decide to use.

Alternately, if you’re not exactly sure of the business problem you’re trying to solve, maybe you need to look at areas in your business that need improvement. Even an analytics-driven strategy — targeted at the right area — can provide useful results with big data.

When it comes to analytics, you might consider a range of possible kinds.

To get in-depth knowledge, enroll for a live free demo on ETL Testing Online Training

Mulesoft Anypoint Connector DevKit

The Anypoint Connector DevKit, or simply DevKit, enables the development of Anypoint Connectors. An Anypoint Connector is an extension module to the MuleSoft Anypoint Platform that facilitates communication between third-party systems/APIs and Mule applications.

Developing Connectors with DevKit

This is what you need to develop DevKit-based Anypoint Connectors on your system with your instance of Anypoint Studio.

  1. See detailed instructions here on how to install: Java JDK version 8, Apache Maven, Anypoint Studio, and Anypoint DevKit Plugin to build and test your connector. You can develop a connector using Windows, Mac, or Linux.
  2. New connector: Create an Anypoint Connector Project.

Existing Connector:

  • Click File > Import > Anypoint Studio > Anypoint Connector Project from External Location, choose a URL or a .zip file, and complete the wizard to locate and import the project.

See also Creating a SOAP Connector.

  • Determine resource access – Each resource has a different access method, such as REST, SOAP, FTP, or the Java SDK features.
  • Choose an authentication mechanism – Mule supports OAuth V1 or V2, and username and password authentication (known as connection management), which can be used for protocols such as API Key, SAML, NTLM, Kerberos, or LDAP.
  • Choose the connector’s data model – Models can be static Java objects or dynamic objects. You can use DataSense to determine what information the target resource expects.
  • Add connector @ attribute annotations – Create code for your connector containing the @ attributes that Mule uses to designate the important parts of your connector.
  • Code tests – Tests can be unit tests, functional tests, and Studio interoperability tests.
  • Document your connector – MuleSoft provides a template that helps you fill in the blanks to create documentation to help your staff and others understand the features and use of your connector.
  • Package your connector.

DevKit Features

Features DevKit provides:

  • Visual design and implementation using Anypoint Studio with an Eclipse-based interface that simplifies and speeds up development.
  • Maven support.
  • Connector packaging tools.
  • Authentication support for multiple types of authentication, including OAuth and username and password authentication.
  • DataSense support to acquire remote metadata.
  • Extensive testing capability.
  • Examples, training, and support to simplify development startup.
  • Batch, Query Pagination, and DataSense Query Language support.

DevKit is an annotations-based tool, with a wide set of available annotations to support its features.

What is a Connector?

An Anypoint Connector is an extension module that eases the interaction between a Mule application and external resources, such as databases or APIs, through REST, SOAP, or the Java SDK.

As reusable components that hide API complexity from the integration developer, custom connectors facilitate integration with SaaS and on-premises web services, applications, and data sources.

Connectors built using Anypoint DevKit in Anypoint Studio, running in Mule runtime environments, act as extensions of Anypoint Platform.

Connector Architecture

Connectors operate between Mule applications, which are built up from Mule flows, and external resources, which are the targeted resources.

A Mule connector has two operational sides. The Mule-facing side communicates with the resource’s target-facing client side to enable content to travel between the Mule application and the external target resource.

Mule-Facing Functionality

From the Mule-facing side, a connector consists of:

  • Main Java class. Java code that you annotate with the @Connector attribute.
  • Connector attributes. Properties of the @Connector class that you annotate with the @Configurable attribute.
  • Methods. Functionality that you annotate with the @Processor attribute.

Additional annotations define authentication-related functionality, such as connection management. Annotations allow you to control the layout of the Anypoint Studio dialogues for the connector as well. The data model and exceptions that either raise or propagate are also Mule-facing classes.

DevKit generates a scaffold connector when you create your Anypoint Connector project in Studio. This scaffold connector includes the @Connector class, the @Configurable attributes, the @Processor methods, and authentication logic to build out your connector.

Target-Facing Functionality

The target facing or client facing side of a connector depends on the client technology that enables access to the resource. This functionality consists of a class library and one or more classes that @Connector classes use to access client functionality. This functionality is called the client class.

The client class in turn generally depends on other classes to actually implement calls to the targeted resource. Depending on your target, some of these classes may be generated or provided for you.

For example, if you have a Java client library, or are working with SOAP or REST services, most of the client code is implemented there. In other cases, you have to write the code yourself.

Coding a Connector

DevKit lets you build connectors from scratch. Before creating your own connector, check the Anypoint Exchange for available connectors. The connectors page also lists Community open source connectors that let you contribute to the growing community of public connector development.

Connector Data Model

The data model for the connector consists of the objects passed into and out of the exposed operations. While many Web services accept and return XML or JSON data, a proper Mule connector must translate the data format the client uses into Java objects – either POJOs or key-value maps which represent the data objects sent to, and returned from, the target. (Returning raw XML or JSON responses to Mule is one marker for an immature, improperly implemented connector.)
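
For example, a connector’s client class that receives JSON might translate it into a key-value map with a library such as Jackson. This is an illustrative sketch, not DevKit-generated code:

import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.Map;

public class ResponseMapper {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Turns a raw JSON response from the target system into a Map that Mule
    // components and the integration developer can work with directly.
    public static Map<String, Object> toMap(String json) throws Exception {
        return MAPPER.readValue(json, Map.class);
    }
}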

REST Versus SOAP

REST simplifies access to HTTP using POST, GET, PUT, and DELETE calls to provide access to creating, getting, putting, and deleting information on a resource.

SOAP is a traditional means of communicating with a resource and requires a WSDL file, which is an XML file that specifies the service’s operations, messages, types, and documentation. SOAP is an industry standard with tools for governance, building, and schema information. DevKit provides a tool that helps build a connector from a WSDL file.

DevKit 3.9 Default Connector Project Classes

The following is an example of the starting @Connector and @Configuration classes that DevKit 3.9 creates:

package org.mule.modules.newconnector;

import org.mule.api.annotations.Config;
import org.mule.api.annotations.Connector;
import org.mule.api.annotations.Processor;

import org.mule.modules.newconnector.config.ConnectorConfig;

@Connector(name="connpom", friendlyName="Connpom")
public class ConnpomConnector {

    @Config
    ConnectorConfig config;

    /**
     * Custom processor
     *
     * @param friend Name to be used to generate a greeting message.
     * @return A greeting message
     */
    @Processor
    public String greet(String friend) {
        /*
         * MESSAGE PROCESSOR CODE GOES HERE
         */
        return config.getGreeting() + " " + friend + ". " + config.getReply();
    }

    public ConnectorConfig getConfig() {
        return config;
    }

    public void setConfig(ConnectorConfig config) {
        this.config = config;
    }

}

The DevKit 3.9 @Configuration class is as follows:

package org.mule.modules.newconnector.config;

import org.mule.api.annotations.components.Configuration;
import org.mule.api.annotations.Configurable;
import org.mule.api.annotations.param.Default;

@Configuration(friendlyName = "Configuration")
public class ConnectorConfig {

    /**
     * Greeting message
     */
    @Configurable
    @Default("Hello")
    private String greeting;

    /**
     * Reply message
     */
    @Configurable
    @Default("How are you?")
    private String reply;

    /**
     * Set greeting message
     *
     * @param greeting the greeting message
     */
    public void setGreeting(String greeting) {
        this.greeting = greeting;
    }

    /**
     * Get greeting message
     */
    public String getGreeting() {
        return this.greeting;
    }

    /**
     * Set reply
     *
     * @param reply the reply
     */
    public void setReply(String reply) {
        this.reply = reply;
    }

    /**
     * Get reply
     */
    public String getReply() {
        return this.reply;
    }

}

DevKit 3.9 Default pom.xml

The pom.xml file for a DevKit 3.9 project. The <parent> section shows DevKit’s group ID org.mule.tools.devkit.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

    <modelVersion>4.0.0</modelVersion>
    <groupId>org.mule.modules</groupId>
    <artifactId>newconnector-connector</artifactId>
    <version>1.0.0-SNAPSHOT</version>
    <packaging>mule-module</packaging>
    <name>Mule Newconnector Anypoint Connector</name>

    <parent>
        <groupId>org.mule.tools.devkit</groupId>
        <artifactId>mule-devkit-parent</artifactId>
        <version>3.9.0</version>
    </parent>

    <properties>
        <category>Community</category>
        <licensePath>LICENSE.md</licensePath>
        <devkit.studio.package.skip>false</devkit.studio.package.skip>
    </properties>
    <repositories>
        <repository>
            <id>mulesoft-releases</id>
            <name>MuleSoft Releases Repository</name>
            <url>http://repository.mulesoft.org/releases/</url>
            <layout>default</layout>
        </repository>
    </repositories>
</project>

To get in-depth knowledge, enroll for a live free demo on Mulesoft Online Training

Using the CloudHub API

The CloudHub REST API enables you to programmatically access these functions of Runtime Manager:

  • Create an application on CloudHub.
  • Deploy a new version of your application.
  • Change the application properties, including the number of workers and environment variables.
  • Get statistics about your application.
  • Create CloudHub notifications.
  • Create email alerts triggered by your applications.
  • Delete your application.

For an interactive reference that includes supported resources, methods, required properties, and expected responses, see the CloudHub API reference.

The CloudHub API manages only applications deployed to the cloud-based version of Runtime Manager; managing on-premises applications requires a different API.

Getting Started and Authenticating with the API

Before getting started, familiarize yourself with operations for applications.

You can use any HTTP client with the CloudHub API. With Java, use the Jersey client or HttpClient with Jackson for JSON support.

To access the CloudHub API, first authenticate with Anypoint Platform. To authenticate, use the Access Management API. Authentication requires that you supply an access token in the Authorization header.

Your username specifies the environment to access and is in the form “user@environment”. For example, if your username is “jane” and the environment is “Development”, your username is jane@Development. If you don’t specify an environment, the API defaults to Production.
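
As one hedged illustration with the JDK’s built-in HTTP client (Java 11+), the token exchange might look like the sketch below; check the exact login endpoint and payload against your Anypoint Platform version before relying on it:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AnypointLogin {

    public static void main(String[] args) throws Exception {
        // Placeholder credentials; the JSON response contains the access token
        // to send as "Authorization: bearer <token>" on subsequent CloudHub API calls.
        String body = "{\"username\":\"jane\",\"password\":\"secret\"}";

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://anypoint.mulesoft.com/accounts/login")) // assumed login resource
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.body());
    }
}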

Data Format

Resources and methods that return or accept a type use the JSON data format. Here is an example of data received in JSON format in response to a request to get an application:

{
  "domain":"hello",
  "fullDomain":"hello.cloudhub.io",
  "workers":1,
  "hasFile":false,
  "muleVersion": "3.1.2",
  "properties": {
    "foo":"bar"
  },
  "status":"STARTED",
  "workerStatuses":[
     {
       "id":"",
       "host":"xxx.xxx.xxx.xxx",
       "port":8081,
       "status":"STARTED"
     }
   ]
}

CloudHub API

The CloudHub Public API allows you to deploy and manage applications in CloudHub. You can deploy your application, manage schedules and queues within an application, and view logs pertaining to an application. You can also view memory and CPU usage and get statistics about the Mule messages sent using the application.

Overview

You can access the API through an interface provided by the Runtime Manager. The Runtime Manager UI enables access to all of the API endpoints.

The CloudHub Public API provides a one-month retention policy for statistics data. It also enforces a usage limit of 250 connections per application and 500 requests per IP address.

To access this API you must send a request with a valid token. See the Getting Started section to understand how to obtain a token and use it in this API.

To access the API you can use the curl command to make calls to the endpoint directly while supplying the access token and authorization header.

Getting Started

To start using the API, you need to authenticate using Anypoint Access Management and select Runtime Manager after logging in. You can also access the API by including the access token obtained by authenticating via Anypoint Access Management in every request.

Example API Calls

The following example shows how to retrieve schedules for an application.

Request:
 
curl -X GET \
-H 'Authorization:bearer ea27ee48-43d3-4d63-a71c-e7cf3c9ff167' \
-H 'X-ANYPNT-ENV-ID:3a654a5a-dc7f-4b39-997e-55dbd9675bb7' \ 
-H 'X-ANYPNT-ORG-ID:f1f2a7eb-995f-456f-a295-62a9ec9285da' \
'https://anypoint.mulesoft.com/cloudhub/api/applications/stats-gath/schedules?_=1517351860955'
 
Response:
[
 {
  "id":"5a0f6f0696c73657e379c99b_vmpersistentqueue_pocFlow1_polling_vmpersistentqueue_pocFlow1_1",
  "flow":"vmpersistentqueue_pocFlow1","name":"vmpersistentqueue_pocFlow1 Poll",
  "href":"wss://anypoint.mulesoft.com/api/applications/stats-gath/schedules/5a0f6f0696c73657e379c99b_vmpersistentqueue_pocFlow1_polling_vmpersistentqueue_pocFlow1_1",
  "lastRun":"2018-01-27T06:28:02.483Z",
  "enabled":true,
  "status":"IDLE",
  "schedule":{"timeUnit":"seconds","period":1},
  "run-href":"wss://anypoint.mulesoft.com/api/applications/stats-gath/schedules/5a0f6f0696c73657e379c99b_vmpersistentqueue_pocFlow1_polling_vmpersistentqueue_pocFlow1_1/run"
 }
]

Access Management API

The Anypoint Access Management API enables you to access administrative functionality of Anypoint Platform, including:

  • User Management
  • Client Management
  • Invitation and Signup
  • Organizations and Business Groups
  • Roles and Permissions
  • Environments
  • Entitlements

Getting Started

To begin using the Access Management API, you obtain an access token.

Support

If the Access Management API is not working, contact MuleSoft and provide information about this issue.

If possible, provide the following information to have a better understanding of the situation.

  • The operation that is not working correctly and the associated request.
  • Steps to reproduce the issue.

To get in-depth knowledge, enroll for a live free demo on Mulesoft online Training

DevOps integration with ServiceNow

DevOps is an integration of software development and IT operations. It aims to improve the relationship between them with better communication and collaboration. It includes automation between these business units to deliver software with speed and good quality. The integration of DevOps with ServiceNow will extend the collaboration between the development teams and the customers.

ServiceNow is a software platform that supports IT services and also helps to automate IT business management. It is a cloud-based platform that is flexible and powerful for delivering services and assuring service availability. Furthermore, the DevOps integration helps resolve customer issues faster and prioritizes customer service for both teams.

ServiceNow DevOps module

The integration brings rich functionality to the ecosystem. It makes the customer service platform more flexible, provides real-time visibility to the development teams, and supports cross-functional collaboration.

Here, the role of an integration manager becomes important, as this person is responsible for integrating both platforms. The integration ensures that all existing and historical data is available to the user, and all ServiceNow tickets synchronize with Azure DevOps.

There are many DevOps automation tools available that help teams maintain large IT infrastructure with more agility, such as AWS, Chef, Jenkins, Splunk, AppDynamics, and Nagios. These tools serve different purposes for DevOps teams in their work.

ServiceNow likewise has some in-demand modules that serve different purposes, such as ITFM, HR, PA, and GRC (Governance, Risk & Compliance). Furthermore, there are Analytics, CRM, Intelligence and Reporting, IT Business Management, and more.

Reasons to use ServiceNow

ServiceNow is used because of its features, which companies are adopting broadly: it is simple to use, fast, powerful, and pre-equipped with various plans. It is a powerful platform that covers application development, financial services management, HR service management, IT services, security operations and management, and more.

These are a few of the main reasons the platform is used by so many organizations. Furthermore, it is simple to use and manage, and it provides various help-desk services for issues that occur during use.

ServiceNow Enterprise DevOps

ServiceNow DevOps reduces the friction between IT operations and development by integrating scalable enterprise DevOps. Enterprise DevOps is a collaborative approach that rethinks the way we have traditionally built software, and its launch helps companies adopt DevOps practices.

This collaboration provides integrated services to the IT operations team with proper control, transparency, security, agility, flexibility, and speed. The workflow includes software planning, coding, testing, and deployment.

Furthermore, this collaboration aims at continuous software delivery to cloud platforms worldwide, which helps enterprises work faster.

Benefits of DevOps ServiceNow Integration

This integration helps in many ways, including the following benefits:

  • It helps both teams make better and faster decisions while developing software or providing customer service.
  • It helps the customer get the full context of their requirements along with their priorities.
  • Moreover, it dramatically accelerates customer response time.
  • It also strengthens the delivery ecosystem with better functionality and collaboration.
  • It provides visibility into customer issues and helps to solve them early.
  • It ensures customer service is prioritized with less manual communication.
  • Furthermore, it categorizes customer tickets to transfer them to Azure DevOps.
  • It is simple to use.
  • It resolves not only IT-related issues but also issues across various sectors, and it provides various services that support this.

Moreover, it also ensures complete traceability of different services. Business unit development depends largely on customer behavior.

Azure DevOps ServiceNow Change management

This extension enables the integration of ServiceNow change management with Azure Pipelines.

The integration requires installing the Azure Pipelines application on the ServiceNow platform, which allows user accounts to be created there; the user can then create and edit the details on the platform.

Next, a service connection to ServiceNow must be created in Azure Pipelines. This connection in Azure DevOps stores the connection details used to reach the external service.

The service connection supports two types of authentication: basic authentication and OAuth2. Basic authentication needs a service account for a user, while granting that user a role on the ServiceNow platform requires registering Azure DevOps there.

Next, the gate service must be created. We then configure a release pipeline, which requires adding a pre-deployment gate for this change management.

The inputs for the gate include a short description, description, category, risk, priority, impact, assignment group, schedule, and so on. The output variables include the change request number and the system ID.

Finally, we set the gate's success criteria. These include the desired state: the gate succeeds and the pipeline continues when the change request state matches the given value. There are many other things to create and maintain while building such an extension, and this integration can be used in many ways.

ServiceNow ticketing tool

ServiceNow has also introduced a ticketing tool. This tool helps large companies resolve major IT issues. It works in three steps: reporting, managing, and finally resolving the issue. First the issue is reported, then it is managed in the best possible way, and finally it is resolved with the best available solution, much as an IT professional would handle it.

An issue can be reported directly, by email, or through the service desk. Managing the issue includes steps such as assessment, assignment, and handling. Furthermore, the tool also provides a facility for tracking service desk activities.

There are different types of ticketing available, such as incident management, request management, change management, and problem management.

ServiceNow integration list

Many companies and large organizations use this integration, across sectors such as computer software, IT services, hospitality and healthcare, financial services, higher education, insurance, retail, and government. All of them are keeping up with the latest trends in technology, and with the help of this service they are giving their products and operations an edge.

Along with these services, companies take advantage of other support offerings such as help desk and customer support. Companies in many countries use this service.

Thus, the sections above explain DevOps integration with ServiceNow and its different aspects. The integration has many uses: it helps both teams make better decisions, prioritizes customer support, and enables faster, more scalable delivery and service. Making a career in this field is a great choice for new aspirants.

To excel in this field and to learn more about ServiceNow, its services, and its integrations, one can opt for ServiceNow Online Training from various online sources. It will help enhance your skills and provide a path toward career development.

To get in-depth knowledge, enroll for a live free demo on Servicenow Training
