At the end of May, I had the privilege to attend the Hitachi Customer Conference. This was the first time attending and not presenting at a Conference in 4 years! So it was a great opportunity for me to really dig in and learn.
I was also able to attend the App in a Day training before the conference. Expect a recap soon. I would highly recommend this training, as it helped pull everything together and let me “officially” learn about canvas apps, model-driven apps, Flow and more!
Here is the recap from some of the sessions I attended. I have also included a few pictures from the event below.
This session gave a great overview of the Business Application Solution Ecosystem (BASE), addressed common questions and concerns, and went over real-life examples where apps have been used to improve processes.
Canvas apps are great for moving Excel-based processes to something more automated. Model-driven apps are helpful when more complex data relationships are in play. Then add process flows on top to help with the complex processes surrounding that data.
Security is a major area of concern for most companies. Keep in mind that PowerApps will not allow access to anything the user cannot already access, so make sure you are reviewing your access policies and security roles. Anything blocked at the network level will also be inaccessible via PowerApps (for example, if you can’t log in to Twitter, then PowerApps can’t connect to Twitter). There are also Data Policies that let you configure rules for which connections can be used and how they interact with business data.
How can we make great PowerApps experiences? Find ways to keep the user engaged. Make data entry simple and quick, and only show the data that is necessary in context. For mobile, remember to design for one-handed use. Users will be more likely to enter data if they see quality data already there, so find ways to help them keep it clean!
Some examples discussed included a centralized ticketing system and submitting maintenance requests.
Go with the Flow
Joel went over the basics to know before starting Flow and also went through some examples of common uses.
Why should we be using Flow? Eventually Flow is going to replace background workflows. Workflows are still needed for real-time use cases, but Flow should be able to handle most background processes, which means that creating new workflows is just creating technical debt in your system. Flows also have a far superior design experience: you can edit and reorder steps easily. Plus, Flows can connect with other systems and perform much higher-level automation.
Similar to the PowerApps security discussion above, a Flow cannot do anything that the user running it could not do manually. If needed, you can create a Flow that runs under a service account not tied to a specific user. System Flows can run as the owner of the Flow (just as a Workflow can) based on actions in the system, which can cause actions to occur that the triggering user does not have permission to perform.
Flow vs Logic Apps: both are built on the same platform with the same performance. Logic Apps are billed per run, so it is recommended to see how much can be done with Flow before moving to Logic Apps. If something is modeled as a Flow, you can export it, edit the code and then import it as a Logic App, so you don’t need to start over.
A few common examples we discussed include Approvals, Notifications and Data Quality operations such as contact scoring, Data8 Address Validation, and comparing emails with an exclusion list.
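The exclusion-list example boils down to a membership check that a Flow condition would run against each email. Here is a rough sketch of the same logic in Python (the function name and sample addresses are my own, not from the session):

```python
def is_excluded(email, exclusion_list):
    """Return True if the email address appears on the exclusion list.

    Addresses are normalized (trimmed, lowercased) so that
    'Jane.Doe@Example.com ' and 'jane.doe@example.com' match.
    """
    normalized = email.strip().lower()
    excluded = {e.strip().lower() for e in exclusion_list}
    return normalized in excluded

exclusions = ["noreply@example.com", "Spam@Example.com"]
print(is_excluded("NOREPLY@example.com", exclusions))  # True
print(is_excluded("jane@example.com", exclusions))     # False
```

In a real Flow this would be a condition step on the trigger’s email address; the point is simply that normalization happens before the comparison, so casing and stray whitespace don’t cause misses.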
To troubleshoot, you can open the Flow run record, review which steps completed, and see where the issue occurred. You can also see all runs for a particular Flow. If you need to see all Flows in the system, that is available in the Admin center.
Keep in mind that Flow runs outside of Dynamics 365, so your Flows should not impact the performance of your environment the way a workflow can.
Everyone deals with issues of duplicate contacts or contacts with incorrect data. In this session, we discussed how to determine the size of the issue you are dealing with and next steps to take.
The first step is to build a game plan. Start by auditing your data to find the issues and their scope, and talk to users to learn which problems they are running into. This should help you build a written proposal to drive executive sponsorship and set a plan for next steps. Look out for data issues caused by design: for example, searches that take too long, too many required fields, duplicates coming in from other data sources, etc.
It is very important to have a data steward: a person who reviews incoming data updates to identify and resolve new data issues. They work continually with users to get feedback and also handle training on how to enter good data. Review the data regularly to ensure the issue does not continue to grow.
Be aware of data syncing with Outlook. The Outlook sync is a great idea but can cause duplicates in Outlook or D365. Be sure to review your filters before turning the sync on, and determine what actually needs to sync.
Duplicate Detection can be used to prevent some duplicates (manual or Outlook) but there are limitations. Consider looking at third party tools such as Data8 to assist with this.
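A basic duplicate detection rule boils down to grouping records that share a matching field, such as the same email address. This sketch illustrates that idea in Python; the field names and sample contacts are illustrative, and real rules can of course match on more than one field:

```python
from collections import defaultdict

def find_potential_duplicates(contacts):
    """Group contacts that share a normalized email address.

    Mirrors a simple 'same email address' duplicate-detection rule:
    records whose normalized emails collide are flagged together.
    """
    groups = defaultdict(list)
    for contact in contacts:
        email = (contact.get("email") or "").strip().lower()
        if email:
            groups[email].append(contact["name"])
    # Only groups with more than one record are potential duplicates.
    return {email: names for email, names in groups.items() if len(names) > 1}

contacts = [
    {"name": "Jane Doe", "email": "jane@example.com"},
    {"name": "J. Doe", "email": "JANE@example.com "},
    {"name": "Sam Roe", "email": "sam@example.com"},
]
print(find_potential_duplicates(contacts))
```

The limitation mentioned above shows up immediately: a rule like this only catches exact matches after normalization, which is why fuzzier matching usually needs a third party tool.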
Finally, review the alternate key options available. Records cannot share the same key values, so defining a good alternate key can help ensure clean data. Just make sure the key is not so strict that users enter bad data to get around the system.
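For reference, alternate keys can also be created programmatically through the Dataverse Web API by posting EntityKeyMetadata to the entity’s Keys collection. The sketch below only builds the request; the org URL, schema name and label are placeholders of my own, and actually sending it would require an OAuth bearer token:

```python
import json

def build_alternate_key_request(org_url, entity, schema_name, label, key_attributes):
    """Build the URL and JSON body for creating an alternate key via the
    Dataverse Web API (POST .../EntityDefinitions(...)/Keys).

    The payload follows the documented EntityKeyMetadata shape; the
    caller-supplied names are placeholders, not values from the session.
    """
    url = f"{org_url}/api/data/v9.2/EntityDefinitions(LogicalName='{entity}')/Keys"
    payload = {
        "SchemaName": schema_name,
        "DisplayName": {
            "@odata.type": "Microsoft.Dynamics.CRM.Label",
            "LocalizedLabels": [{
                "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel",
                "Label": label,
                "LanguageCode": 1033,  # English (United States)
            }],
        },
        # The combination of these attributes must be unique per record.
        "KeyAttributes": key_attributes,
    }
    return url, json.dumps(payload)

url, body = build_alternate_key_request(
    "https://yourorg.crm.dynamics.com",  # hypothetical org URL
    "contact", "new_contactemailkey", "Contact Email Key", ["emailaddress1"])
```

Making the key on the email address means a second contact with the same email is rejected at the platform level, which is exactly the “cannot have duplicate keys” guarantee described above.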
The 4 Elements you Need to Successfully Develop Custom Solutions
Here we focused on the stakeholder experience piece of delivering solutions. There are 4 elements that work together to create a great stakeholder experience: High Quality, Predictable Delivery, Frequent Feedback, and Embracing Change. These qualities should be considered whenever reviewing a process change to ensure it helps meet these goals.
High Quality means that developers are responsible for testing their code before sharing it. The Quality Assurance team is then responsible for validating that the solution is fit for its intended purpose, not just that it meets the documented requirements.
Predictable Delivery means that we are working toward a predictable release date using a predictable release process. We know how much work we can get done in a given sprint and we know how much work is needed, so we can use this to provide a reliable estimate.
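The estimate described above is simple arithmetic: remaining work divided by sprint velocity, rounded up. A quick sketch (the point values and sprint length are made-up numbers, not figures from the session):

```python
def release_estimate(remaining_points, velocity_per_sprint, sprint_length_weeks=2):
    """Estimate sprints and weeks needed to finish the remaining backlog.

    Uses ceiling division: a partially filled final sprint still
    counts as a full sprint for release planning.
    """
    sprints = -(-remaining_points // velocity_per_sprint)  # ceil without math.ceil
    return sprints, sprints * sprint_length_weeks

sprints, weeks = release_estimate(remaining_points=120, velocity_per_sprint=25)
print(f"{sprints} sprints, about {weeks} weeks")  # 5 sprints, about 10 weeks
```

The reliability of the estimate comes entirely from how stable the velocity number is, which is why a predictable release process matters as much as the arithmetic.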
Frequent Feedback encourages demonstrating functionality early and often then listening for feedback. This includes getting stakeholders access to environments so they can test as we go along.
Finally, Embracing Change encourages us to constantly look for ways to improve. When impediments to change arise (fear, uncertainty, and doubt), look deeper for the source of those emotions and address it. Evaluate why successes succeeded and look for the causes of failures. Review, address, and update the process.
Here we focused on ensuring users have a great experience when they are using your solutions. User Experience is more than how the tool looks; it includes all interactions with the tool and the processes surrounding it.
User Experience is how a customer interacts with the tool. User Interface is part of this but specific to layout, visual design and branding.
It may be difficult to get a full view of the User Experience because we can lose sight of the goals of the user and instead rely on assumptions. So it is important to work closely with the actual end users to see their day-to-day processes and get their thoughts on what you are working on.
The 7 factors that influence user experience are valuable, useful, usable, findable, credible, desirable and accessible.
So as you can see, there were lots of valuable sessions for me at this event. Who else attended? What were your top takeaways?