API Manager is a component of Anypoint Platform for designing, building, managing, and publishing APIs. Anypoint Platform uses Mule as its core runtime engine.
You can use API Manager in a public cloud such as CloudHub, in a private cloud, or in a hybrid deployment.
A hybrid deployment is an API deployed on a private server but having metadata processed in the public cloud.
Sign up to use Anypoint Platform on the internet or in a private cloud, or obtain credentials from your administrator.
If you set up your own account in Anypoint Platform, you are assigned the Organization Administrators role. In this role, you see and can open API Manager when you log into Anypoint Platform.
Alternatively, you can use the Anypoint Platform command line interface (CLI) to interact with API Manager.
The lifecycle of an API involves setup and deployment, management, and engaging users on an API Portal. To perform these tasks, you might need to obtain permissions and roles from the Anypoint Platform administrator for your organization.
The workflow color-codes tasks as follows:
Administrative task (orange).
API or app developer task (blue).
You can use the auto-discovery process to register and start a Mule app. API Manager can discover and register a Mule app deployed from Studio without user intervention.
You can deploy an API implementation to a Mule Runtime that runs on a server, such as CloudHub in the public cloud. You can also deploy to a private cloud or hybrid.
Regardless of the server location, each server belongs to an environment, such as Dev, Test, or Production.
Servers are defined in business groups and each business group can have multiple environments.
You publish your API on the portal and apps request access, as depicted in the following diagram.
In API Manager 2.x, you can use instances of an API in multiple environments, and you can have multiple instances in the same environment. You can create one instance of an API that serves as a proxy.
You can create another to manage directly as a basic endpoint. You can apply caching policies, for example to the proxy, and throttling policies and security policies to the basic endpoint.
Through the Autodiscovery scheme, API Manager can track the API throughout the life cycle as you modify, version, deploy, govern, and publish it. API Manager 2.x is tightly integrated with the following tools:
Design Center for creating the RAML structure of the API
The API design capabilities of Design Center replace the API Designer tool previously available in Anypoint Platform (as of July 2017).
Exchange for storing and publishing API assets
Assets are components, such as API versions, templates, and connectors owned by MuleSoft or your organization.
Studio for implementing the API
API changes made in Studio are synchronized using Autodiscovery with the API registered in other tools, such as Design Center and API Manager.
API Manager manages APIs that reside in Exchange or imports the APIs in a ZIP file from the file system. The ZIP is an API object that you exported from API Manager. You set up the API for RAML, HTTP, or WSDL management.
RAML/OAS for REST APIs: Provide the RESTful API Modeling Language (RAML) or OpenAPI Specification (OAS) source, which you can write using Design Center.
HTTP for REST APIs without a specified spec (RAML or OAS): Provide the URL of the inbound HTTP or HTTPS endpoint.
WSDL for SOAP APIs
Provide the URL where Anypoint Platform can find the Web Services Description Language source.
After configuring the API, you can perform API management tasks such as applying policies, setting up SLA tiers, assigning permissions to the API environment, versioning, and viewing API analytics.
The great news in the integration world fell like a flash from the sky: Salesforce bought MuleSoft! In a way, the IT world is rediscovering integration, this old topic is becoming a hot topic!
In 2016, Google acquired APIGEE’s API Management solution; last year saw MuleSoft’s spectacular IPO, and now its acquisition by Salesforce.
But Why Acquire an Integration Solution?
Well, because the market is growing very fast, driven by cloud integration needs and iPaaS solutions. And MuleSoft offers one, of course! Salesforce also obviously has a major interest in MuleSoft's solution integrating well with its own.
Indeed, a failed integration may reflect poorly on Salesforce's own solution, through no fault of its own. So this is about control: if Salesforce controls the integration solution, it has more control over potential failures.
I recently had the opportunity to study the capabilities of Salesforce's integration solutions, and between those whose quality did not satisfy me and those that work but remain dependent on Salesforce billing, I didn't see much that impressed me.
In short, everything works together so that Salesforce has its integration solution, which we hope will not be dependent on Salesforce billing.
But Why MuleSoft and Not Another?
The market has evolved quite a bit, but the visibility of this world is perhaps quite low. Gartner has not evaluated ESBs in its Magic Quadrant since 2014, and players like MuleSoft and Talend have since emerged.
The actors have changed, and MuleSoft has a complete quality suite. There are still a number of areas for improvement, but the same can be said for many actors.
Will MuleSoft Remain Independent?
For now, the answer is yes! And the reason is simple: to sell MuleSoft at a low cost to its Salesforce customers, Salesforce will also need to sell to non-Salesforce customers in order to maximize MuleSoft's reach.
In addition, based on Heroku’s feedback, we can bet on great connection facilities between Salesforce and MuleSoft but, nonetheless, Mule will still remain independent.
What’s the Future for MuleSoft?
MuleSoft has given itself exceptional visibility and is quickly becoming the de facto choice for new integration needs.
If we add Salesforce's investment capabilities, we can surely count on a great leap forward in MuleSoft's functionalities.
What’s the Future for the Competition?
There is a good chance that the GAFAM companies have spotted this news and are wondering what they can do to keep up.
I wouldn't be surprised if Microsoft (which also owns a Salesforce competitor) or Google, should it want to make a very big acquisition, studies the question very quickly. The headlines in the world of integration are not, in my opinion, finished!
Anypoint Runtime Fabric is a container service that automates the deployment and orchestration of Mule applications and API gateways. Runtime Fabric runs within a customer-managed infrastructure on AWS, Azure, virtual machines (VMs), and bare-metal servers.
Some of the capabilities of Anypoint Runtime Fabric include:
Isolation between applications by running a separate Mule runtime per application.
Ability to run multiple versions of Mule runtime on the same set of resources.
Scaling applications across multiple replicas.
Automated application fail-over.
Application management with Anypoint Runtime Manager.
Runtime Fabric and Other PaaS Providers
Anypoint Runtime Fabric contains all of the components it requires. These components, including Docker and Kubernetes, are optimized to work efficiently with Mule runtimes and other MuleSoft services.
If you are already using a PaaS solution, MuleSoft recommends deploying Runtime Fabric in parallel with your PaaS. This enables you to take advantage of the complete benefits of Anypoint Platform.
Connecting Runtime Fabric to Anypoint Management Center
Anypoint Runtime Fabric supports the following:
Deploying applications from Anypoint Runtime Manager.
Deploying policy updates of API gateways using API Manager.
Storing and retrieving assets with Anypoint Exchange.
To enable integration with the Anypoint Management Center, Runtime Fabric requires outbound access to Anypoint Platform on port 443. This connection is secured using mutual TLS.
A set of services running on the controller VMs initiates outbound connections to retrieve the metadata and assets required to deploy an application.
These services then translate and communicate with other internal services to cache the assets locally and deploy the application.
Anypoint Runtime Fabric and Standalone Mule Runtimes (Hybrid Deployments)
Hybrid deployments of Mule applications require you to install a version of the Mule runtime on a server and deploy one or more applications on the server. Each application shares the Mule runtime and the resources made available to it.
Other resources such as certificates or database connections may also be shared using domains.
Anypoint Runtime Fabric provisions resources differently. Each Mule application and API gateway runs within its own Mule runtime and in its own container.
The resources each container can access are specified when deploying a Mule application or API proxy. This enables Mule applications to scale horizontally across VMs without relying on other dependencies.
It also ensures that different applications do not compete with each other for resources on the same VM.
Checklist for Using Anypoint Runtime Fabric
The following sections list the general requirements and considerations for successfully using Anypoint Runtime Fabric. Ensure the following criteria have been met before beginning the installation.
Consult your network or operations administrators to ensure that these are in place before installing Anypoint Runtime Fabric.
Anypoint Platform is a unified, hybrid, highly productive integration platform that allows developers to create a seamless application network of applications, data, and devices.
Anypoint Platform provides the runtime, tools, frameworks, and a library (central repository) for our APIs and applications:
A runtime to run and deploy our applications in the cloud or on-premises.
Tools and frameworks for creating APIs and building simpler Mule applications (for complex applications, we use Anypoint Studio).
A library to store our assets (APIs and applications), where we can view and test them, and where other developers in the organization can reuse them.
MuleSoft Anypoint Platform Capabilities and Benefits
The key components of the MuleSoft platform include:
Anypoint Design Center – Development tools that make it easy to design APIs, implement integration flows, and build connectors
Anypoint Management Center – A unified Web interface for managing all aspects of the platform, including API users, traffic, SLAs, underlying integration flows, and more
Anypoint Exchange – A collaboration hub for searching for prebuilt, private and public reusable assets
Mule Runtime Engine – Combines real-time application integration and orchestration with robust data integration capabilities
Anypoint Connectors – Provide out-of-the-box assets you need to connect faster and tools to develop your own connectors
Runtime Services – A comprehensive suite of platform services that provide enterprise-grade security, reliability, scalability, and high-availability
Key capabilities of the platform include:
APIs
Build new APIs quickly, design new interfaces for your existing APIs, and simplify API management
Rapidly expose valuable data to mobile devices, web apps, and connected devices in a secure and controlled way
Enable and empower the entire organization with Experience APIs, Process APIs, and System APIs
B2B
Modernize B2B by extending Mulesoft’s API-led connectivity approach to B2B and EDI
Build reusable services across multiple trading partners and B2B processes
Data Integration
Combines batch and real-time processing for unified application and data integration
Offers a template-driven approach to development
DevOps
Streamlines adoption of popular DevOps frameworks for continuous integration and efficient deployment
ESB
Combines the power of data and application integration across legacy systems and SaaS applications, with a seamless path to benefitting from other capabilities in the Anypoint platform and API-led connectivity
Quickly build integrations ranging from simple to advanced with pre-built connectors and templates
Internet of Things (IoT)
Connect and orchestrate data from your enterprise and the cloud to devices at the edge of your network—including point of sale systems, medical devices, sensors, and more—using open standards, developer-friendly tools, and out-of-the-box transport protocols.
Microservices
Enables your organization to develop new solutions in a manageable, reusable, and governed way
Mobile
Enables fast, easy, and governed mobile access to any data from backend systems, legacy databases, and SaaS Applications
Since Mule ESB is built on top of Java and Spring, its strong integration capabilities make it easy to invoke Java methods. There are many ways to invoke a Java method from a Mule ESB flow.
Based on our expertise at Massil Technologies, we are going to discuss the most effective and simplest method, known as the Invoke component.
Invoke Component:
By using the Invoke component, we can invoke a specified method of an object defined in a Spring Bean. We can provide an array of argument expressions to map the message to the method arguments.
We provide the method name, and with that, along with the number of argument expressions provided, Mule determines which method to use.
Mule automatically transforms the results of the argument expressions to match the method argument type where possible.
If you have some advanced objects as your input argument, you can always give the argument types along with the argument array.
Note: Mule does not support invoking multiple methods with the same name and number of arguments, which most Java developers know as method overloading.
Configuring the Invoke Component:
Use the invoke component when you have an existing method defined in custom Java code that you wish to use in processing a message. Configuring an invoke message processor involves two steps:
Include the object that contains the method in the application's src/main/java directory.
Configure the invoke message processor to reference the method in that object.
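As a sketch of these two steps in Mule 3 XML (the bean id, class name, flow name, and config-ref below are hypothetical, not taken from the original article):

```xml
<!-- Register the object as a Spring bean (hypothetical names) -->
<spring:beans>
    <spring:bean id="greeter" class="com.example.Greeter"/>
</spring:beans>

<flow name="invokeDemoFlow">
    <http:listener config-ref="HTTP_Listener_Configuration" path="/greet" doc:name="HTTP"/>
    <!-- Calls Greeter.greet(String), mapping the payload to the method argument -->
    <invoke object-ref="greeter" method="greet"
            methodArguments="#[payload]"
            methodArgumentTypes="java.lang.String" doc:name="Invoke"/>
</flow>
```

Note that methodArgumentTypes is only needed when Mule cannot resolve the target method from the method name and argument count alone.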
Step 1:
In Anypoint Studio, click File > New > Mule Project to create a new project, give it whatever name you wish, and click Finish.
Step 2:
Drag an HTTP Connector from the palette to your empty canvas.
Step 3:
Click the HTTP Connector to open its properties editor, then click the green arrow icon to create a new configuration for it. Leave all of the fields at their defaults and click OK.
Example
The following example creates a “Hello World” flow with an invoke component to implement a specific method in a referenced class.
Right-click src/main/java and click New > Class.
Paste the code that follows and save your project.
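The class body was not reproduced in this excerpt; the following is a minimal sketch of the kind of class an Invoke component could call (the class and method names are assumptions, not the original sample):

```java
// Hypothetical POJO that the Invoke component references as a Spring bean.
public class Greeter {

    // The Invoke component maps an argument expression such as
    // #[payload] to this String parameter.
    public String greet(String name) {
        return "Hello " + name + "!";
    }
}
```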
If you have not done so already, click Run > Run As > Mule Application.
What is Mule ESB? Mule ESB is a Java-based enterprise service bus (ESB) and integration platform; developers can connect their applications through the ESB. Mule uses a service-oriented architecture.
It works regardless of the different technologies the applications use, including JMS, web services, SMTP, and HTTP. The advantage of an ESB is that it allows different applications to communicate.
Messages can be in any format, from SOAP to JSON. Mule ESB provides a messaging framework that enables the exchange of data among applications.
Why Mule ESB? Mule ESB is a lightweight integration framework but highly scalable, allowing you to start with a small application and connect multiple applications over time.
Mule manages all the interactions between applications and components transparently; the ESB takes care of the multiple applications, and we can easily integrate third-party applications using Mule.
What are all the Primitives used in Mediation?
We have different types of primitives in mediation.
Message Filter
Type Filter
Endpoint Lookup
Service Invoke
Fan-out
Fan-in
XSLT
BO Map
Message Element Setter
DB lookup
Data Handler
Custom Mediation
Header Setters
Message Logger
Event Emitter
Stop
Fail
Sub Flow
What is Shared Context?
Shared Context: The context is a temporary area that is created along with the Service Message Object (SMO) in mediation flows. Shared context is a type of context present in the SMO. It is mainly used in aggregation, where we need to iterate over a BO a certain number of times.
Shared context maintains aggregation data between the aggregation (FanOut and FanIn) primitives. The content (data) present in the shared context BO does not persist across request and response flows; that is, data placed in the shared context during the request flow cannot be used again in the response flow.
What is Transient Context?
Transient Context: Used for passing values between mediation primitives within the current flow (either the request flow or the response flow). The transient context cannot link requests and responses, and hence cannot be used across them.
It is used when you want to save an input message before a service invoke call (within a request or response flow). After the service invoke call, the next primitive can create another message by combining the service invoke response with the original message stored in the transient context.
What Difficulties Does Mule Encompass?
Transport: applications can accept input from a variety of means, from the file system to the network.
Data format: speaking the right protocol is only part of the solution, as applications can use almost any form of representation for the data they exchange.
Invocation styles: synchronous, asynchronous, or batch call semantics entail very different integration strategies.
Lifecycles: applications of different origins that serve varied purposes tend to have disparate development, maintenance, and operational lifecycles.
What ESBs Are Available Apart From Mule?
All major JEE vendors (BEA, IBM, Oracle, Sun) have an ESB in their catalog. It is usually based on their middleware technologies and sits at the core of a much broader SOA product suite.
There are also some commercial ESBs that have been built by vendors not in the field of JEE application servers, like the ones from Progress Software, IONA Technologies, and Software AG.
What are the various types of Exception Handling?
Global Exception Handling
Catch Exception Handling
Choice Exception Handling
Default Exception Handling
Rollback Exception Handling
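As an illustration of one of these, a catch exception strategy in Mule 3 XML (the flow, config-ref, and message values below are hypothetical):

```xml
<!-- Hypothetical flow: errors raised by processing are caught
     and turned into a friendly response instead of propagating -->
<flow name="orderFlow">
    <http:listener config-ref="HTTP_Listener_Configuration" path="/orders" doc:name="HTTP"/>
    <flow-ref name="processOrder" doc:name="Process Order"/>
    <catch-exception-strategy doc:name="Catch Exception Strategy">
        <logger level="ERROR" message="#[exception.message]" doc:name="Log Error"/>
        <set-payload value="Something went wrong" doc:name="Set Payload"/>
    </catch-exception-strategy>
</flow>
```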
What are the characteristics of Mule ESB?
An ESB is used for the purpose of integration with an approach that is service-oriented. Its features include:
Message Routing Service
Message Transformation Service
Set of Service Containers
Web Service Security
In Mule, how do you develop and consume SOAP services?
SOAP services can be created just like any other Mule project; the difference is that we import a WSDL rather than a RAML. SOAP services can be consumed in a Mule flow by using the CXF component or the Web Service Consumer.
How can you find out whether your project requires ESB?
As every project might not require an ESB, you should analyze first to see if your project might benefit from ESB implementation. Certain things that should be at the front of your mind while you analyze the need for ESB are:
If the project requires integration of more than three applications or services; if communication is needed between only two applications, point-to-point integration would be enough
Sometimes you will need to scale the project in the future, where a need might arise to interact with multiple services. This is required only by a few projects that perform heavy tasks
If the project requires message routing abilities such as aggregating and forking message flows. This feature is not necessary for all projects
You should have clarity on the architecture of what needs to be achieved. A simple POC integrating small parts to find out the benefits is much better
As most of the ESBs are on the expensive side, first evaluate whether your project budget permits ESB use
What are the different kinds of Flow Processing Strategies?
The following are the kinds of Flow Processing Strategies:
Thread Per Processing Strategy
Custom Processing Strategy
Queued Asynchronous Flow Processing Strategy
Asynchronous Flow Processing Strategy
Synchronous Flow Processing Strategy
Queued Flow Processing Strategy
Non-blocking Flow Processing Strategy
What is RAML and why we use it?
RAML – RESTful API Modeling Language
RAML is similar to WSDL; it contains the endpoint URL, request/response schemas, HTTP methods, and query and URI parameters.
RAML helps the client (a consumer of the service) know what the service is and what operations can be invoked, and how.
RAML helps the developer create the initial structure of the API. RAML can also be used for documentation purposes.
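For illustration, a minimal RAML 1.0 sketch showing those pieces (the title, resource, parameter, and field names are hypothetical):

```raml
#%RAML 1.0
title: Customer API
version: v1
baseUri: https://api.example.com/{version}
/customers:
  get:
    queryParameters:
      city:
        type: string
        required: false
    responses:
      200:
        body:
          application/json:
            example: |
              [{ "id": 1, "name": "Alice", "city": "Austin" }]
  post:
    body:
      application/json:
        example: |
          { "name": "Bob", "city": "Boston" }
```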
Why is MuleSoft preferred over other ESB implementations?
Mule is lightweight but highly scalable, allowing you to start small and connect more applications over time. The ESB manages all the interactions between applications and components transparently, regardless of whether they exist in the same virtual machine or over the Internet, and regardless of the underlying transport protocol used.
Several commercial ESB implementations provide limited functionality or are built on top of an existing application server or messaging server, locking you into that specific vendor. Mule is vendor-neutral, so different vendor implementations can plug into it. You are never locked in to a specific vendor when you use Mule.
What Is Global Endpoint In Mule?
For an endpoint destination that is shared by several routers, it is worth creating a global endpoint. A global endpoint is not typified for inbound or outbound routing, making it usable in many different places in a configuration file.
It must be named so it can actually be used in a service, which will reference the global endpoint by its name. A global endpoint can also help clarify the usage of a particular destination.
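A sketch in classic Mule 3 configuration (the endpoint and flow names are hypothetical): the endpoint is defined once globally and then referenced by name:

```xml
<!-- Global endpoint (hypothetical names), defined once at the top level -->
<http:endpoint name="ordersEndpoint" host="localhost" port="8081" path="orders"/>

<flow name="receiveOrders">
    <!-- References the global endpoint by its name -->
    <inbound-endpoint ref="ordersEndpoint"/>
    <logger level="INFO" message="#[payload]" doc:name="Logger"/>
</flow>
```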
Mule Enterprise Service Bus is a middleware technology that quickly, easily, and securely connects the enterprise. Unlike typical middleware software, Mule as an ESB is a Java-based middleware solution that is easy to use and easy to scale.
What is Mule Cache Scope and what are its storage types?
Caching in Mule ESB can be done by Mule Cache Scope. Mule Cache Scope has 3 storage types –
In-memory: This stores the data in system memory. Data stored in-memory is non-persistent, which means that if the API restarts or crashes, the cached data will be lost.
Configuration Properties:
Store Name
Maximum number of entries
TTL (Time to live)
Expiration Interval
Managed-store: This stores the data in a place defined by a ListableObjectStore. Data stored with Managed-store is persistent, which means that if the API restarts or crashes, the cached data will not be lost.
Configuration Properties:
Store Name
Maximum number of entries
TTL (Time to live)
Expiration Interval
Persistence (true/false)
Simple-test-file-store: This stores the data in a file. Data stored with the Simple-test-file-store configuration is persistent, which means that if the API restarts or crashes, the cached data will not be lost.
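As a sketch (the store, flow, and config names are hypothetical), an in-memory caching strategy in Mule 3 XML whose attributes map to the configuration properties listed above:

```xml
<!-- Hypothetical caching strategy: name, maxEntries, entryTTL and
     expirationInterval correspond to the properties above
     (TTL and interval in milliseconds) -->
<ee:object-store-caching-strategy name="CachingStrategy" doc:name="Caching Strategy">
    <in-memory-store name="myCacheStore"
                     maxEntries="500"
                     entryTTL="60000"
                     expirationInterval="30000"/>
</ee:object-store-caching-strategy>

<flow name="cachedLookupFlow">
    <http:listener config-ref="HTTP_Listener_Configuration" path="/lookup" doc:name="HTTP"/>
    <ee:cache cachingStrategy-ref="CachingStrategy" doc:name="Cache">
        <!-- The expensive call whose response is cached -->
        <http:request config-ref="Backend_Config" path="/data" method="GET" doc:name="Backend"/>
    </ee:cache>
</flow>
```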
Apigee Edge is a single platform for API management, developer services and analytics. Functions include API design, security, publishing, monitoring and monetisation, as well as microservice management.
Users praised the analytics functionality, but were sceptical about the use of Swagger, which developers may need to learn to work with the tool.
Apigee also provides related services like Apigee-127, which developers can use to design and build enterprise-class APIs in Node.js and deploy them on any Node.js system.
What is Mulesoft’s Anypoint Platform?
Mulesoft created the Anypoint Platform by merging API management into its Mule integration platform in 2013. Although this can be an attractive proposition, companies that already have an integration platform may find the Anypoint Platform too much for them.
Apigee vs Mulesoft Pricing
Apigee subscriptions are divided into four tiers, ranging from the free Evaluation tier (one user, one environment; 100,000 API calls a month; 30 days of analytics reports) to the Enterprise tier (10 organisation/environment combinations; 10 billion API calls a year; 12 months of reports).
Users nearly unanimously mentioned the high cost, but said that it is worth paying for the value and functionality gained.
Mulesoft pricing is divided into three tiers: Gold, Platinum and Titanium. All three cover base functionality like management, API design and ‘unlimited’ API portals.
Connectors require an additional premium. The Platinum and Titanium tiers include enterprise features like external identity management and business groups.
Advantages:
Apigee:
It’s a good return on my investment.
The ease of creating policies has been the most useful of the solution’s features.
It’s a quick ramp-up time.
It’s easy for our support staff to implement the policies in the API management layer.
I have found the most valuable features to be tracing a proxy, and managing proxy versions and revisions via the Edge UI component.
I have not encountered instability with the product. We use it to build API proxies for securing targeted back-ends, with an emphasis on Continuous Integration/Continuous Delivery (CI/CD).
It accelerates development and deployment processes.
Mulesoft:
The most valuable feature is the ability to investigate APIs.
The ESB, the enterprise service bus, is what we primarily use, in addition to API management. These are the two tools we have been using extensively on the enterprise platform.
Apigee vs Mulesoft
The Apigee vs Mulesoft question is not a simple one to answer. For example, pricing is not an easy comparison to make as neither firm discloses cost information.
Potential customers will also struggle to make a decision based on the parent company, as they might do in other technology markets.
Google has a reputation for innovation, while Salesforce is firmly entrenched as an enterprise solution, but in reality this has so far had little effect on how the companies are run as business units – although Google has focused on Apigee’s performance as a cloud product.
The choice will eventually come down to technical capabilities, where both Apigee and Mulesoft have their own particular strengths.
Apigee is more mature in microservice management, while Mulesoft is ahead in integration capabilities.
Both companies are working to close these gaps – see Mulesoft’s October 2019 release and Google’s Anthos platform – so prospective clients will soon need to look even deeper to find differentiation between the two.
Ultimately, for general purposes, the choice is likely to be a business decision as much as a technical one.
The MCD – Integration Professional exam is a highly technical exam, designed to assess true expertise as a MuleSoft integration developer. The exam verifies strong software development skills, broad and deep Anypoint Platform knowledge, and appreciable experience using MuleSoft solutions on multiple types of projects.
Preparation recommendations:
To prepare for taking the MCD – Integration Professional exam, you need to learn and have development experience with the topics listed in the Exam Topics section of this guide. This knowledge is best achieved by following this path:
Take the Anypoint Platform Development: Fundamentals training course, the self-paced MuleSoft.U Development Fundamentals course, or gain equivalent knowledge.
Pass the MCD – Integration and API Associate exam (this is suggested but not mandatory).
Take the Anypoint Platform Development: Advanced training course or gain equivalent knowledge.
Get six months to several years of hands-on Mule project experience. Note that actual development experience is required in addition to attending the training courses.
Real project experience is needed to pass this challenging exam, not merely classroom or study time.
Required experience: This test is designed to verify product expertise gained through significant experience using MuleSoft products on a variety of projects. There is no official requirement for the amount of experience you need to pass the exam, but history shows that people who pass this challenging exam have months or years of real MuleSoft project experience.
Exam cost: The exam fee is $250 USD per person per attempt.
Exam format:
Multiple-choice
Closed-book
Proctored
Number of questions: 100
Duration: Up to 2 hours are permitted
Language: English
Delivery method: In a Kryterion testing center location or online
Availability: Both in-person and online exams must be scheduled in advance
Testing location:
There are two options for taking the exam:
In any testing center location within Kryterion’s worldwide network
Online from any internet-connected computer using Kryterion Webassessor’s Online Proctoring service and a qualifying, external webcam.
Exam pass requirement: A score of 80% or higher is required to pass the MCD – Integration Professional exam.
Exam topics: The topics covered in the MCD – Integration Professional exam are listed here.
General:
Understanding basic MuleSoft implementation and design concepts
Understanding data movement through an application
Basics:
Understanding Mule applications, flows, messages, and message processors
Using flow variables and session variables
Writing Mule expressions
Defining Mule properties and creating properties files
HTTP Connector:
Creating and configuring inbound and outbound HTTP endpoints
Using HTTP and HTTPS
Understanding HTTP content-type and the effect on browser types
Flows:
Using flows, sub-flows, and flow references
Understanding the differences between inbound and outbound endpoints
Configuring flow processing strategies
Coding and testing exchange patterns (like request-response and one-way)
Testing using JUnit and MUnit cases with Mule applications
Sending a Mule message from a test class to a Mule application
Flow Control:
Using splitters, aggregators, and multi-cast routers
Using the For-each scope
Using filters
Error Handling:
Debugging flows and expression handlers
Understanding the different exception strategies that are available
Using exception strategies and understanding how they affect flows and sub-flows
Changing and returning a message from an exception strategy
Configuring global application exception handling
Using routers (like First Successful and Until Successful) to handle potential error conditions
Transformations with DataWeave:
Using the DataWeave Transform Message component
Writing DataWeave expressions
Using DataWeave with data sources that have associated metadata
Adding custom metadata to data sources
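For illustration, a small DataWeave 1.0 transformation of the kind written in the Transform Message component (the payload fields used here are hypothetical):

```dataweave
%dw 1.0
%output application/json
---
{
  orderId: payload.id,
  customer: payload.customer.name,
  items: payload.items map ((item) -> {
    sku: item.sku,
    qty: item.quantity
  })
}
```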
Web Services:
Implementing REST services with GET, POST, PUT, and DELETE methods
Using annotations on REST methods to create unique signatures
Creating REST clients and working with dynamic endpoints
Publishing and consuming SOAP messages
Using CXF interfaces to create service definitions
Extending interfaces to create CXF implementations
Scopes:
Configuring and using batch processing
Using the Cache Scope to store and reuse frequently called data
Creating and managing caching strategies
Using Enrichers to enhance a Mule message
Deployment:
Understanding the general concepts and benefits for building Mule clusters
Managing runtime clusters
Using queues to distribute application flows for processing in clusters
Understanding how clustering supports various Mule transport mechanisms
Deploying applications on-prem using Mule Management Console
Deploying applications to CloudHub
Organizing Spring properties and Spring property file configuration
Java Components:
Creating and testing Java custom components and integrating them into flows
Using advanced Java concepts to invoke service calls for passing Mule messages
Creating custom filters with Java
Configuring Java components to be prototypes or singletons
Using the default entry point resolver with Java components
Connectors and Transports:
Configuring and using Database connectors
Understanding how Database inbound and outbound endpoints differ and their limitations
Configuring JMS connectors for two-way communications, temporary queues, and object serialization over transports
Using back channels and creating two-way communication through JMS connections
Understanding how JMS uses correlation IDs
Using VM Transport to control how messages are sent and received by components in a system
Using VM Transport for communication between Mule flows
Understanding queue usage with VM Transport and configuration structure
Configuring and using File and FTP connectors
Transactions:
Understanding transaction management and which endpoints support transactions
Managing and configuring resource transactions for inbound and outbound messages
Understanding the various transaction types and usage techniques
Every worksheet in Tableau contains shelves and cards, such as Columns, Rows, Marks, Filters, Pages, Legends, and more.
By placing fields on shelves or cards, you:
Build the structure of your visualization.
Increase the level of detail and control the number of marks in the view by including or excluding data.
Add context to the visualization by encoding marks with color, size, shape, text, and detail.
Experiment with placing fields on different shelves and cards to find the optimal way to look at your data.
Options for starting a view
If you aren’t sure where to place a field, you can let Tableau help you determine the best way to display the data.
You can drag fields from the Data pane and drop them onto the cards and shelves that are part of every Tableau worksheet.
You can double-click one or more fields in the Data pane.
You can select one or more fields in the Data pane and then choose a chart type from Show Me, which identifies the chart types that are appropriate for the fields you selected.
You can drop a field on the Drop field here grid, to start creating a view from a tabular perspective.
Columns and Rows shelves
Drag fields from the Data pane to create the structure for your visualizations.
The Columns shelf creates the columns of a table, while the Rows shelf creates the rows of a table. You can place any number of fields on these shelves.
When you place a dimension on the Rows or Columns shelves, headers for the members of that dimension are created. When you place a measure on the Rows or Columns shelf, quantitative axes for that measure are created.
As you add more fields to the view, additional headers and axes are included in the table and you get an increasingly detailed picture of your data.
In the view shown below, the members of the Segment dimension are displayed as column headers, while the Profit measure is displayed as a vertical axis.
Tableau displays data using marks, where every mark corresponds to a row (or a group of rows) in your data source. The inner fields on the Rows and Columns shelves determine the default mark type.
For example, if the inner fields are a measure and a dimension, the default mark type is a bar. You can manually select a different mark type using the Marks card drop-down menu.
Adding more fields to the Rows and Columns shelves adds more rows, columns, and panes to the table.
Hide rows and columns
Generally you will add dimensions and measures to create the rows and columns of the table and you’ll either include all data or add filters to only show a subset.
However, when you filter data it is also excluded from calculations and other computations performed on the summarized data in the table.
Instead of filtering the data, you can hide the row or column so it doesn’t display in the view but it is still included in calculations.
Hiding columns is especially useful when using table calculations that compare to a previous or next date value.
In these cases, there will be a row or column that doesn’t show data because there is no data to be compared to. Hide the empty column to keep the table calculation intact.
To hide a row or column:
Right-click (control-click on Mac) the row or column you want to hide, and then select Hide.
Marks card
The Marks card is a key element for visual analysis in Tableau. As you drag fields to different properties in the Marks card, you add context and detail to the marks in the view.
Filters shelf
The Filters shelf allows you to specify which data to include and exclude. For example, you might want to analyze the profit for each customer segment, but only for certain shipping containers and delivery times. By placing fields on the Filters shelf, you can create such a view.
You can filter data using measures, dimensions, or both at the same time. Additionally, you can filter data based on the fields that make up the columns and rows of the table. This is called an internal filter.
You can also filter data using fields that don’t contribute headers or axes to the table. This is called an external filter. All filtered fields display on the Filters shelf.
Suppose you are not interested in the Home Office data. You can remove this column from the view by filtering the Segment dimension. To do so, select Filter on the field menu or drag the Segment dimension to the Filters shelf.
The Filter dialog box opens. By default all members are selected. Clear the check box for Home Office to exclude it from the view. All selected members will be included.
Suppose you want to only view profit for a category of the products. Even though the Category field is not used on the Rows and Columns shelves or on the Marks card, you can still add a filter.
Drag the Category dimension to the Filters shelf. This is an example of an external filter because Category is not part of the view.
The Filter dialog box automatically opens. By default, none of the members are selected. Select the members you want to keep as part of the view. All cleared members are excluded. In this example, Office Supplies is selected.
The modified data view is shown below. The mark label shows that the sum of the profit for the Consumer segment has decreased to $56,330.
This number is derived by summing all the rows in the data source that are associated with the Consumer segment and are part of the Office Supplies category.
Pages shelf
The Pages shelf lets you break a view into a series of pages so you can better analyze how a specific field affects the rest of the data in a view. When you place a dimension on the Pages shelf, you are adding a new page for each member in the dimension.
When you place a measure on the Pages shelf, Tableau automatically converts the measure into a discrete measure.
The Pages shelf creates a set of pages, with a different view on each page. Each view is based on a member of the field you placed on the Pages shelf.
You can easily flip through the views and compare them on a common axis, using the controls that get added to the view when you move a field to the Pages shelf.
Additional Shelves, Legends, Cards, and Controls
Some shelves, legends, cards, or controls are only displayed as a result of things that you do as you work with views.
Tableau provides controls for moving or otherwise customizing these elements of the view.
The following list describes each such shelf, legend, card, or control.
Measure Values Shelf – Measure Values is a special field that always appears at the bottom of the Measures area of the Data pane and contains all the measures of your data collected into one field.
Tableau automatically adds Measure Values to the view when multiple measures are sharing the same axis.
When Measure Values is in the view, Tableau displays a Measure Values shelf that shows which measures are being included. You can add measures to or remove measures from this card.
Color Legend – Shows how colors are allocated when there is a field on Color.
Shape Legend – Shows how shapes are allocated when there is a field on Shape.
Size Legend – Shows how sizes are allocated when there is a field on Size.
Map Legend – Shows the legend for the symbols and patterns on a map. The map legend is not available for all map providers.
Parameter Controls – A separate parameter control is available for every parameter in the workbook.
Title – A title is displayed by default for every view. The default title is the sheet name. Double-click a title (Control-click on a Mac) to edit it.
Caption – Choose Show caption from the Worksheet menu to display a caption for the view.
Summary Card – Choose Show summary from the Worksheet menu to display a summary card for the view.
Page Control – Provides options for navigating through pages when there is a field on the Pages shelf.
The Java Properties class, java.util.Properties, is like a Java Map of String key and value pairs. The Properties class can write the key-value pairs to a properties file on disk, and read the properties back in again. This is an often-used mechanism for storing simple configuration properties for Java applications.
Create a Properties Instance
To use the Java Properties class you must first create a Properties instance, which you do via its constructor and the new operator. Here is an example of creating a Java Properties instance:
Properties properties = new Properties();
Set Properties
To set properties in a Java Properties instance you use the setProperty() method. Here is an example of setting a property (key – value pair) in a Java Properties object:
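A minimal sketch (the email key and value are illustrative):

```java
import java.util.Properties;

Properties properties = new Properties();
// setProperty() stores a String key / String value pair
properties.setProperty("email", "john@example.com");
```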
Get Properties
To get properties from a Java Properties object you use the getProperty() method, passing the key of the property to get as a parameter. Here is an example of getting a property from a Java Properties instance:
String email = properties.getProperty("email");
Remove Properties
You can remove a property from a Java Properties instance using its remove() method, passing as parameter to remove() the key for the property to remove. Here is an example of removing a property from a Java Properties instance:
properties.remove("email");
public class Properties extends Hashtable<Object,Object>
The Properties class represents a persistent set of properties. The Properties can be saved to a stream or loaded from a stream. Each key and its corresponding value in the property list is a string.
A property list can contain another property list as its “defaults”; this second property list is searched if the property key is not found in the original property list.
Because Properties inherits from Hashtable, the put and putAll methods can be applied to a Properties object. Their use is strongly discouraged as they allow the caller to insert entries whose keys or values are not Strings. Learn more skills from Java Online Course
The setProperty method should be used instead. If the store or save method is called on a “compromised” Properties object that contains a non-String key or value, the call will fail. Similarly, the call to the propertyNames or list method will fail if it is called on a “compromised” Properties object that contains a non-String key.
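The defaults mechanism described above can be sketched as follows; the timeout and retries keys and values are illustrative:

```java
import java.util.Properties;

Properties defaults = new Properties();
defaults.setProperty("timeout", "30");

// The Properties(Properties) constructor installs the defaults list
Properties properties = new Properties(defaults);
properties.setProperty("retries", "3");

// "retries" is found in the properties object itself;
// "timeout" is not, so the lookup falls back to the defaults
String retries = properties.getProperty("retries");
String timeout = properties.getProperty("timeout");
```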
The load(Reader) / store(Writer, String) methods load and store properties from and to a character-based stream in a simple line-oriented format specified below. The load(InputStream) / store(OutputStream, String) methods work the same way as the load(Reader) / store(Writer, String) pair, except the input/output stream is encoded in the ISO 8859-1 character encoding. Characters that cannot be directly represented in this encoding can be written using Unicode escapes; only a single 'u' character is allowed in an escape sequence.
Iterate Properties
You can iterate the keys of a Java Properties instance by obtaining the key set for the Properties instance, and iterating this key set. Here is an example of obtaining the key set of a Java Properties to iterate all its keys:
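A sketch with illustrative keys and values:

```java
import java.util.Properties;

Properties properties = new Properties();
properties.setProperty("property1", "value1");
properties.setProperty("property2", "value2");

// keySet() returns the keys as a Set<Object>; each key of a
// well-formed Properties object is a String
for (Object key : properties.keySet()) {
    String value = properties.getProperty((String) key);
    System.out.println(key + " = " + value);
}
```

Alternatively, stringPropertyNames() returns a Set<String> of keys that also includes keys from the defaults list.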
Store Properties to File
You can store the property key-value pairs to a properties file which can be read again later. You store the contents of a Properties object via its store() method. Here is an example of storing the contents of a Java Properties instance to a properties file:
Properties properties = new Properties();
properties.setProperty("property1", "value1");
properties.setProperty("property2", "value2");
properties.setProperty("property3", "value3");
try(FileWriter output = new FileWriter("data/props.properties")){
properties.store(output, "These are properties");
} catch (IOException e) {
e.printStackTrace();
}
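Reading a stored properties file back in works via the load() method. This sketch round-trips through a file in the working directory (the file name is illustrative) so it is self-contained:

```java
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.util.Properties;

Properties original = new Properties();
original.setProperty("property1", "value1");
try (FileWriter output = new FileWriter("props-demo.properties")) {
    original.store(output, "These are properties");
} catch (IOException e) {
    e.printStackTrace();
}

Properties loaded = new Properties();
try (FileReader input = new FileReader("props-demo.properties")) {
    loaded.load(input);  // parses key=value lines, skipping # comment lines
} catch (IOException e) {
    e.printStackTrace();
}
```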
Property File Encoding
By default the Java Properties file encoding is ISO-8859-1 (Latin-1). However, these days it is more common to use UTF-8 as encoding. You can specify the file encoding as the second parameter of the FileWriter used when the file is stored (this FileWriter constructor is available from Java 11). Here is an example of setting the Java Properties file encoding (charset) to UTF-8:
try(FileWriter output = new FileWriter("data/props.properties", Charset.forName("UTF-8"))){
properties.store(output, "These are properties");
} catch (IOException e) {
e.printStackTrace();
}
Property File Format
A Java Properties property file consists of one key=value pair per line. Here is an example Java Properties property file:
#These are properties
#Thu Jul 04 21:29:20 CEST 2019
property2=value2
property1=value1
property3=value3
Lines starting with # are comments. Notice that the first line of the properties file is the comment that was passed as the second parameter to the store() method call in the code example in the previous section. The remaining lines contain the property key-value pairs in key=value format.
Store Properties to XML File
The Java Properties class can also write the key-value pairs stored in it to an XML file via its storeToXML(). Here is an example of storing a Java Properties to an XML file:
Properties properties = new Properties();
properties.setProperty("property1", "value1");
properties.setProperty("property2", "value2");
properties.setProperty("property3", "value3");
try(FileOutputStream output = new FileOutputStream("data/props.xml")){
properties.storeToXML(output, "These are properties");
} catch (IOException e) {
e.printStackTrace();
}
Property XML File Encoding
By default the Java Properties XML property file encoding is UTF-8. Note that this is the reverse of the default for non-XML property files. If you need to use another encoding for the XML file, you can specify it as the third parameter to the storeToXML() method (the Charset overload is available from Java 10; an older overload takes the encoding name as a String). Here is an example of storing a Java Properties to XML using the ISO-8859-1 encoding:
try(FileOutputStream output = new FileOutputStream("data/props.xml")){
properties.storeToXML(output, "These are properties", Charset.forName("ISO-8859-1"));
} catch (IOException e) {
e.printStackTrace();
}
XML Property File Format
The Java Properties object stored to XML file in the example in the previous section looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
<properties>
<comment>These are properties</comment>
<entry key="property2">value2</entry>
<entry key="property1">value1</entry>
<entry key="property3">value3</entry>
</properties>
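Reading an XML property file back in is symmetric, via the loadFromXML() method. This sketch round-trips through an in-memory stream so it is self-contained:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Properties;

Properties original = new Properties();
original.setProperty("property1", "value1");

ByteArrayOutputStream out = new ByteArrayOutputStream();
Properties loaded = new Properties();
try {
    original.storeToXML(out, "These are properties");
    // loadFromXML() parses the <entry key="...">value</entry> elements
    loaded.loadFromXML(new ByteArrayInputStream(out.toByteArray()));
} catch (IOException e) {
    e.printStackTrace();
}
```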