
AI-Driven Embedded Analytics Part 2: Defining Requirements


Arthur Freeman

July 23, 2024 - 7 min read

Updated: August 13, 2024

We continue to discuss the new generation of AI-powered embedded analytics with a review of considerations for your product requirements. If you haven't already, check out the first article in this series: AI-Driven Embedded Analytics Part 1: Setting Goals.

The following ten areas are important to address in your Product Requirements Document (PRD), in addition to the product considerations specific to your organization, industry, and use cases.

Together, these will inform your estimated project time frame, budget, and choice between a buy or build approach. To round out the series, we also dive into implementation in AI-Driven Embedded Analytics Part 3: Implementation Plan.

(1) AI Prompt Engineering

For generative AI, in most cases you will want to choose a closed language model, one that does not use the information shared in your application to train large language models (LLMs) that are available to people outside of your organization and your customers.

In addition, you will likely want to restrict the scope of the language model’s outputs to prevent people from asking extraneous questions or generating inappropriate content.

It’s helpful to provide guidance with pre-configured standard questions as well as built-in actions, so it is simple to get started and easy to take the next step.
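As a rough illustration of this scoping pattern, the sketch below pairs a restricted system prompt with pre-configured starter questions and a simple keyword gate. The prompt wording, topic list, and call_llm stub are hypothetical placeholders, not any particular vendor's API.

# Minimal sketch of scoping a closed LLM to analytics questions only.
# The system prompt, starter questions, topic list, and call_llm stub are
# illustrative placeholders.

SYSTEM_PROMPT = (
    "You are an embedded analytics assistant. Answer only questions about "
    "the customer's sales and usage data. If a question is out of scope, "
    "reply: 'I can only help with questions about your analytics data.'"
)

# Pre-configured standard questions that make it simple to get started.
STARTER_QUESTIONS = [
    "What were my top five products by revenue last quarter?",
    "How did weekly active users trend over the past 90 days?",
    "Which regions missed their sales targets this month?",
]

ALLOWED_TOPICS = ("revenue", "sales", "users", "usage", "region", "product")

def call_llm(system: str, user: str) -> str:
    """Hypothetical stand-in for your closed language model's chat API."""
    raise NotImplementedError("Wire this to your model provider's SDK.")

def ask(question: str) -> str:
    """Reject clearly out-of-scope questions before they reach the model."""
    if not any(topic in question.lower() for topic in ALLOWED_TOPICS):
        return "I can only help with questions about your analytics data."
    return call_llm(system=SYSTEM_PROMPT, user=question)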

(2) Architecture and Containerization 

Your organization likely already has a preferred architecture, whether that means a single cloud provider, a hybrid model, or on-premises deployment.

If your organization is considering a change, for example moving to work with multiple cloud providers, it’s helpful to ensure that your embedded analytics and AI support both current and future deployment choices without requiring a major overhaul or replacement.

When working with a vendor for AI-powered embedded analytics, it’s helpful to ask them about their current support and roadmap for containerization.

Containerization is increasingly popular for AI-powered embedded analytics. Containers allow your software developers to deploy applications across multiple environments without rewriting the code. 

Embedded containers are lightweight components that package code and dependencies to enable applications to run on their host systems with low memory and power consumption.

Benefits of containerization for embedded analytics and AI include:

  • Improved developer productivity

  • Lower testing costs 

  • Increased product quality

  • Reduced time to market

  • Fewer quality assurance (QA) problems
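As a small, hedged illustration of that portability, the sketch below uses the Docker SDK for Python to run the same analytics container image on whatever Docker host the environment points at, whether a laptop, an on-premises server, or a cloud VM. The image name, port, and environment variable are hypothetical.

# Sketch: run one analytics container image unchanged on any Docker host.
# The image name, port, and environment variable below are hypothetical.
import docker

client = docker.from_env()  # connects to whichever Docker host is configured

container = client.containers.run(
    "registry.example.com/embedded-analytics:1.4",  # hypothetical image
    detach=True,
    ports={"8080/tcp": 8080},       # expose the embedded analytics API
    mem_limit="512m",               # keep the memory footprint small
    environment={"ANALYTICS_ENV": "staging"},
)
print(container.short_id, container.status)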

(3) Automation and Operational Efficiency 

Automation affects the time and personnel needed to develop, test, and launch your AI-powered embedded analytics, as well as how you will ease deployment, upgrades, and day-to-day operations.

For adding AI-powered embedded analytics to your existing application, you will want to leverage your current and planned tech stack; development, security, and operations (DevSecOps) processes; and continuous integration and continuous delivery (CI/CD) pipelines.
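For example, a CI/CD stage might run a short post-deployment smoke test against the embedded analytics service. The sketch below assumes a hypothetical health endpoint and uses the requests library.

# Post-deployment smoke test a CI/CD stage might run; the URL is hypothetical.
import sys
import requests

ANALYTICS_HEALTH_URL = "https://staging.example.com/embedded-analytics/health"

def main() -> int:
    resp = requests.get(ANALYTICS_HEALTH_URL, timeout=5)
    if resp.status_code != 200:
        print(f"Smoke test failed: HTTP {resp.status_code}")
        return 1
    print("Embedded analytics endpoint is healthy.")
    return 0

if __name__ == "__main__":
    sys.exit(main())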

(4) Compliance and Trust

Ensure that the AI-powered embedded analytics or GenAI solution complies with relevant data privacy regulations such as the General Data Protection Regulation (GDPR) and AI Act in Europe, the California Consumer Privacy Act (CCPA) in the United States, or similar laws in other jurisdictions. 

These requirements include obtaining consent for data collection and processing, protecting users' personally identifiable information (PII), implementing appropriate data encryption and security measures, and providing users with control over their data.

Depending on your industry, you may also need to comply with financial regulations, such as the Sarbanes-Oxley Act or the Payment Card Industry Data Security Standard (PCI DSS), or, for U.S. healthcare, the Health Insurance Portability and Accountability Act (HIPAA).
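As one small, hedged example of protecting PII, the sketch below redacts email addresses and U.S.-style phone numbers before a prompt is logged or forwarded to a language model. The patterns are illustrative and far from an exhaustive PII detector.

# Redact common PII before logging a prompt or sending it to a model.
# These regexes are illustrative only, not a complete PII detector.
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def redact_pii(text: str) -> str:
    text = EMAIL_RE.sub("[EMAIL REDACTED]", text)
    text = PHONE_RE.sub("[PHONE REDACTED]", text)
    return text

print(redact_pii("Contact jane.doe@example.com or 555-123-4567 about the Q2 report."))
# Contact [EMAIL REDACTED] or [PHONE REDACTED] about the Q2 report.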

One of the key concerns with AI is trust. AI hallucinations may result in incorrect or incomplete answers. AI can also raise issues of systemic bias. These are important considerations when choosing a GenAI-powered analytics provider and when educating your internal users or customers about what to expect.

(5) Data Governance and Quality Management 

Clarify roles and responsibilities for data governance within the organization, including data stewards, data custodians, and data governance committees. 

Assign accountability for data governance activities such as data quality management, metadata management, and compliance oversight. Consider how you will measure and assure data quality.

Consider metadata management processes to catalog and document the metadata associated with embedded analytics datasets, including data definitions, data lineage, and data usage policies. 

When possible, maintain a centralized metadata repository to facilitate data discovery, understanding, and governance.
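A centralized repository can be as simple as a catalog of structured entries. The sketch below shows one possible shape for such an entry; the field names and sample values are purely illustrative.

# One possible shape for a metadata catalog entry; field names are illustrative.
from dataclasses import dataclass, field

@dataclass
class DatasetMetadata:
    name: str
    definition: str                                    # plain-language data definition
    steward: str                                       # accountable data steward
    lineage: list[str] = field(default_factory=list)   # upstream sources
    usage_policy: str = "internal"                     # data usage policy

catalog = {
    "sales_daily": DatasetMetadata(
        name="sales_daily",
        definition="Daily sales totals by product and region.",
        steward="finance-data-team@example.com",
        lineage=["erp.orders", "crm.accounts"],
        usage_policy="restricted",
    )
}
print(catalog["sales_daily"].lineage)  # ['erp.orders', 'crm.accounts']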

(6) Explainability and Transparency

Plan to describe the data used to train models and the reasoning behind AI-generated insights. You may also want to explain the concept of a GenAI hallucination to users, so they understand why the AI model may occasionally produce an incorrect or incomplete answer.
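One lightweight way to make that reasoning visible is to return each AI-generated insight together with its provenance. The sketch below shows an illustrative payload shape, not any particular product's API.

# Illustrative payload that pairs an insight with its provenance and a caveat.
from dataclasses import dataclass

@dataclass
class ExplainedInsight:
    text: str                   # the generated insight shown to the user
    source_datasets: list[str]  # data the answer was grounded in
    model_version: str          # which model produced it
    caveat: str                 # reminder that generated answers can be wrong

insight = ExplainedInsight(
    text="West region revenue grew 12% quarter over quarter.",
    source_datasets=["sales_daily", "crm.accounts"],
    model_version="closed-llm-2024-07",
    caveat="Generated answer; verify against the underlying report.",
)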

(7) Monitoring and Management Tools

You will want your AI-powered embedded analytics to integrate with your existing or planned development, security, and operations (DevSecOps) monitoring tools, so you can measure app adoption with a 360-degree management view and real-time insights into usage patterns, error rates, and the commonly used pathways through which people explore and benefit from your product.
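As a hedged sketch of what those integrations can feed on, the example below exposes query counts, error counts, and latency with the prometheus_client library so an existing monitoring stack can scrape them. The metric names and port are illustrative.

# Expose usage and error metrics for the monitoring stack to scrape.
# Metric names and the port are illustrative.
import time
from prometheus_client import Counter, Histogram, start_http_server

QUERIES = Counter("embedded_analytics_queries_total",
                  "Analytics queries served", ["feature"])
ERRORS = Counter("embedded_analytics_errors_total",
                 "Analytics query errors", ["feature"])
LATENCY = Histogram("embedded_analytics_latency_seconds",
                    "Query latency in seconds")

def run_query(feature: str) -> None:
    start = time.perf_counter()
    try:
        ...  # execute the analytics query here
        QUERIES.labels(feature=feature).inc()
    except Exception:
        ERRORS.labels(feature=feature).inc()
        raise
    finally:
        LATENCY.observe(time.perf_counter() - start)

start_http_server(9102)  # /metrics endpoint for Prometheus-compatible tools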

(8) Performance and Scalability

One of the most important performance considerations for AI-powered embedded analytics is loading speed when you scale to many concurrent users. This is one reason why you’ll see a lot of attention placed on in-memory capabilities, direct query optimization, and push-down data blending.

As users, we expect consumer applications and devices to deliver content and services in less than 1 second, ideally in less than 0.2 seconds (200 milliseconds), for a seamless experience. This expectation has popularized caching commonly requested content, blending data into a single integrated data store, and storing data in-memory.

These are all ways to reduce or avoid the delay of an API request to pull data from an external database while the user waits impatiently looking at their screen.
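A minimal version of that idea is a small time-to-live cache in front of the external data source. In the sketch below, fetch_from_warehouse is a hypothetical stand-in for the slow database or API call.

# Tiny time-to-live (TTL) cache so repeated dashboard loads skip the round trip.
# fetch_from_warehouse is a hypothetical stand-in for your data API call.
import time

_CACHE: dict[str, tuple[float, object]] = {}
TTL_SECONDS = 60

def fetch_from_warehouse(query: str) -> object:
    """Hypothetical slow call to an external database or API."""
    time.sleep(0.5)  # simulate network and query latency
    return {"query": query, "rows": []}

def cached_query(query: str) -> object:
    now = time.time()
    hit = _CACHE.get(query)
    if hit and now - hit[0] < TTL_SECONDS:
        return hit[1]                       # served from memory, no round trip
    result = fetch_from_warehouse(query)    # slow path on a cache miss
    _CACHE[query] = (now, result)
    return result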

(9) Security and Accessibility

As part of a robust DevSecOps framework, you should endeavor to deliver security at scale with governance layers at every stage of development and deployment. 

This starts with multi-factor authentication and single sign-on (SSO) integration and extends to role-based security filters that control access and protect data integrity.

Avoid shortcuts such as hardcoding usernames and passwords. Standard authentication options include Active Directory, OAuth 2.0, and the Lightweight Directory Access Protocol (LDAP).
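Two of those habits, keeping credentials out of the code and filtering rows by role, can be sketched in a few lines. The role names, environment variable, and row shape below are illustrative.

# Read secrets from the environment (or a secrets manager), never hardcode them,
# and apply a role-based filter before returning rows. Names are illustrative.
import os

API_TOKEN = os.environ.get("ANALYTICS_API_TOKEN", "")  # not a hardcoded password

ROLE_SCOPES = {
    "sales_rep": "own_region",
    "executive": "all_regions",
}

def filter_rows(rows: list[dict], role: str, user_region: str) -> list[dict]:
    """Return only the rows this role is allowed to see."""
    scope = ROLE_SCOPES.get(role, "none")
    if scope == "all_regions":
        return rows
    if scope == "own_region":
        return [r for r in rows if r.get("region") == user_region]
    return []  # unknown roles see nothing by default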

(10) Usability

Steve Krug offers worthwhile tips for usability in his book Don’t Make Me Think. The goal is that the embedded analytics tell a data-driven story that feels natural in your application so that the user does not need to stop their mental flow to try to figure out where to click or what question to ask in a chat box.


If you would like a PDF version of this three-part article series to share with colleagues, check out the recently published “MicroStrategy Guide to Making Every App Intelligent with Embedded Analytics” authored by Brett Sheppard.


Introduction to MicroStrategy for AI-Powered Embedded Analytics

Hundreds of global companies and public sector agencies trust MicroStrategy as an AI-powered embedded analytics partner to deliver customized analytics experiences on top of a secure layer of trusted metadata with options to add natural language discovery, predictive analytics, and geospatial analysis. 

While other BI and AI vendors have been acquired, merged, sold, or divested, MicroStrategy is the world’s largest independent provider of AI-powered pervasive analytics at scale.

Unlike the previous generation of embedded analytics, MicroStrategy makes it possible for you to integrate controls and actions so they feel like part of your apps and systems. MicroStrategy offers several choices for pricing models including white labeling and revenue sharing.

Learn more about how to make every app intelligent with MicroStrategy AI-powered embedded analytics at https://www.microstrategy.com/embedded-analytics.
