
Wednesday, April 17, 2024

Copilot for Power BI

About

Copilot for Power BI is a feature introduced by Microsoft to provide users with AI-powered assistance while creating reports and dashboards in Power BI. It uses natural language processing (NLP) capabilities to understand user queries and provide relevant suggestions and guidance in real-time as users build their data visualizations.

This feature is designed to make it easier for users, especially those who may not have extensive experience with data analytics or Power BI, to create effective and insightful reports. Copilot can suggest visualizations, recommend data transformations, and offer explanations or insights into the data being analyzed.

This feature needs to be enabled in the tenant settings of the Admin Portal, either at the organization level or for specific AD groups.

Pricing/Consumption

Requests to Copilot consume Fabric Capacity Units (CU). Copilot usage is measured by the number of tokens processed. Tokens can be thought of as pieces of words. As a reference, 1,000 tokens approximately represent 750 words. The Fabric Copilot cost is calculated per 1,000 tokens, and input and output tokens are consumed at different rates. This table defines how many CUs are consumed as part of Copilot usage.  

Operation in Metrics App | Description           | Operation Unit of Measure | Consumption rate
Copilot in Fabric        | The input prompt      | Per 1,000 Tokens          | 400 CU seconds
Copilot in Fabric        | The output completion | Per 1,000 Tokens          | 1,200 CU seconds

Capacity Units (CU) in pricing refer to the computing resources allocated to your Power BI deployment in the cloud. They are a measure of the performance and scalability of your Power BI environment.

If you’re utilizing Copilot for Power BI and your request involves 500 input tokens and 100 output tokens, then you’ll be charged a total of (500*400+100*1,200)/1,000 = 320 CU seconds in Fabric.
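
As a rough sketch of the same arithmetic (the 400 and 1,200 CU-second rates come from the table above, and the token counts are just the example figures), the calculation could be scripted in a Bash shell like this:

# Sketch: CU seconds consumed by a single Copilot request
input_tokens=500      # tokens in the prompt
output_tokens=100     # tokens in the completion
echo $(( (input_tokens * 400 + output_tokens * 1200) / 1000 ))   # prints 320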

The cost of Fabric Capacity Units can vary depending on the region. Regardless of the consumption region where GPU capacity is utilized, customers are billed based on the Fabric Capacity Units pricing in their billing region.

For example, if a customer’s requests are mapped from region 1 to region 2, with region 1 being the billing region and region 2 being the consumption region, the customer is charged based on the pricing in region 1.

Data Security

Data such as prompts, grounding data included in prompts, and AI output is processed and temporarily stored by Microsoft and may be reviewed by Microsoft employees for abuse monitoring.

To generate a response, Copilot uses:

  • The user's prompt or input and, when appropriate,

  • Additional data that is retrieved through the grounding process.

This information is sent to Azure OpenAI Service, where it's processed and an output is generated. Therefore, data processed by Azure OpenAI can include:

  • The user's prompt or input.

  • Grounding data.

  • The AI response or output.

Grounding data may include a combination of dataset schema, specific data points, and other information relevant to the user's current task. Review each experience section for details on what data is accessible to Copilot features in that scenario.

Interactions with Copilot are specific to each user. This means that Copilot can only access data that the current user has permission to access, and its outputs are only visible to that user unless that user shares the output with others, such as sharing a generated Power BI report or generated code. Copilot doesn't use data from other users in the same tenant or other tenants.

Copilot uses Azure OpenAI—not OpenAI's publicly available services—to process all data, including user inputs, grounding data, and Copilot outputs. Copilot currently uses a combination of GPT models, including GPT 3.5. Microsoft hosts the OpenAI models in Microsoft's Azure environment and the Service doesn't interact with any services by OpenAI (for example, ChatGPT or the OpenAI API). Your data isn't used to train models and isn't available to other customers.

Copilot in Power BI Service for Consumers

 

New Report/Page

Create a blank report by picking any published semantic model. In the Copilot pane, enter a prompt that best describes the business need you want the report to address (for example, "Create a page summarizing sales by region and product category").


The semantic model you intend to use for a new report should be built following modeling best practices to get the most out of Copilot. The terminology and naming conventions used in the model should be business friendly so that Azure OpenAI can interpret them correctly.

 

Summarize Visuals

You can summarize report visuals into bullet points or generate a high-level gist of the contents for quick reference. In most cases, these quick insights generated on top of existing reports can be used for executive reporting and presentations.


Copilot in Power BI Desktop for Developers

If you cannot see Copilot in the ribbon in Power BI Desktop, you have to enable it explicitly from the Options dialog (under Preview features).


EU Data Boundary

Customers can configure their service to be in scope for the EU Data Boundary by provisioning their tenant and all Microsoft Fabric capacities in an EU datacenter location. Customer Data and pseudonymized personal data are stored and processed within the EU Data Boundary, aside from specific residual transfers documented in Microsoft's list of services that transfer a subset of Customer Data or pseudonymized personal data out of the EU Data Boundary on an ongoing basis.

Fabric also enables the option to select an Azure region where Customer Data is stored when creating new Microsoft Fabric capacity. The default option listed is your tenant home region. If you select that region, all associated data, including Customer Data, is stored in that Geo. If you select a different region, some Customer Data is still stored in the home Geo. By selecting a region in the EU, Customer Data will be stored in the EU Data Boundary.

MS Fabric Capacity Metrics

Starting from February 2024, you can view the total capacity usage for Copilot under the operation name “Copilot in Fabric” in your Fabric Capacity Metrics App.


K@run@

Sunday, December 10, 2023

Extracting Private key and certificate from .PFX file using OpenSSL


OpenSSL, a popular open-source tool for working with cryptographic operations, can be used to extract the .pem and .key files from a .pfx (or .p12) file.
The steps are as follows:


Extraction of .pem and .key from .pfx file:

Install OpenSSL:

Check that your system has OpenSSL installed. If it is not, you can get it from the OpenSSL website or install it with a package manager such as Homebrew on macOS or apt-get on Linux.
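
For example, assuming you are on macOS with Homebrew or on a Debian/Ubuntu system, the installation and a quick check might look like this:

Code:-
# macOS (Homebrew)
brew install openssl

# Debian/Ubuntu
sudo apt-get update && sudo apt-get install openssl

# Confirm the installation
openssl version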



Conversion Method:

OpenSSL is used to extract the private key and certificate from the .pfx file.

Code:-
openssl pkcs12 -in yourfile.pfx -nocerts -out privatekey.key

Replace yourfile.pfx with the name of your .pfx file. This command extracts the private key and saves it to privatekey.key; you will be prompted for the .pfx password and for a pass phrase to protect the extracted key.


Code:-
openssl pkcs12 -in yourfile.pfx -clcerts -nokeys -out certificate.pem

This command will extract the certificate and save it to certificate.pem.
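
If you need the private key without a pass phrase (for example, for a service that cannot prompt for one), or want to confirm that the extracted certificate and key belong together, the following should work, assuming an RSA key pair:

Code:-
# Strip the pass phrase from the extracted key (keep the resulting file secure)
openssl rsa -in privatekey.key -out privatekey-nopass.key

# The two MD5 hashes below should match if the certificate and key are a pair
openssl x509 -noout -modulus -in certificate.pem | openssl md5
openssl rsa -noout -modulus -in privatekey.key | openssl md5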


K@run@

Saturday, September 9, 2023

Migration of Traditional Reports to Power BI - Quick Guide

Moving reports from traditional analytical tools to Power BI represents a significant stride in harnessing advanced visualization and data analysis capabilities. Below is an initial guide to assist you in the process.

Evaluation and Strategy:
  1. Ascertain which reports to transfer, prioritizing them based on their business significance and complexity.
  2. Gain insight into the data sources utilized in current reports. Determine whether they are compatible with Power BI or whether prerequisites and pre-processing are needed to bring them into a compatible state.
  3. Clearly outline the migration's scope and objectives. Consider whether you're merely relocating reports or if there are opportunities to enhance them using Power BI's functionalities.
Data Readiness:
  1. Ensure that your data sources possess appropriate structure and have been thoroughly cleansed. Power BI performs optimally with well-organized, clean data.
  2. If your data resides in a database, confirm that you possess the requisite credentials and permissions to access it via Power BI.
Power BI Desktop:
  1. Download and install Power BI Desktop if you haven't already. This is the tool you'll employ to design and craft your reports.
  2. Launch Power BI Desktop and acquaint yourself with its interface and capabilities.
Establishing Data Connections:
  1. Within Power BI Desktop, navigate to the "Home" tab and select "Get Data." Choose the relevant data source (e.g., Excel, SQL Server).
  2. Connect to your data source by supplying the necessary connection particulars.
  3. Opt for data import or establish a direct connection, depending on your data source and specific requirements.
Data Transformation:
  1. Utilize Power Query Editor (accessible from the "Home" tab) to modify and structure your data as required.
  2. Undertake data cleansing, establish calculated columns, and implement any necessary transformations.
Report Development:
  1. Construct your reports by dragging and dropping visuals onto the canvas.
  2. Customize the visuals using the formatting choices found in the "Format" and "Visualizations" panels.
  3. Establish relationships between tables if your data necessitates such connections.
DAX Formulas:
  1. Familiarize yourself with Data Analysis Expressions (DAX), Power BI's formula language used for crafting custom calculations.
  2. Compose DAX calculations to derive insights that may not be readily accessible within your data source.
Report Design:
  1. Concentrate on creating reports that are visually captivating and easily comprehensible.
  2. Exploit Power BI's array of visualization options to effectively convey insights.
Interactive Elements:
  1. Leverage Power BI's interactive capabilities, including drill-through functionality, slicers, and bookmarks, to augment user engagement.
Publication:
  1. After finalizing your report, save it as a Power BI Desktop (.pbix) file.
  2. Log in to your Power BI account (or sign up for a free account if you don't have one).
  3. Select the "Publish" button to upload your report to the Power BI service.
Sharing and Collaboration:
  1. Configure suitable permissions to govern report access. Use AD groups to provide access to the workspaces and any published App for the business.
  2. Disseminate the report to colleagues and stakeholders via direct links, embedded reports, or dashboards. 
Maintenance and Oversight:
  1. Make sure scheduled refreshes are working correctly and data quality checks are in place.
  2. Monitor report usage and gather feedback to facilitate continuous enhancement.

Note:- 
Bear in mind that transitioning reports to Power BI might entail a learning curve, especially for newcomers to the platform. Exercise patience and make the most of the abundant online resources, encompassing tutorials, forums, and documentation, to aid in your journey.

Learning:
To start your Power BI journey as a novice, refer to my previous post on the training/learning library.

Friday, November 29, 2019

Unable to open session logs from Load plan executed in ODI 11g

 Hello there,

I faced a strange issue related to ODI 11g navigation and would like to discuss the fix we used in our Linux environment.

We have been using ODI 11g for over five years, and over time the number of jobs and load plans has increased dramatically, so the session log sequence number has grown very high.

The difficulty falls on the operations team: they have to filter the logs and hunt for the errors (a time-consuming task) instead of opening the errored session log directly from the load plan.

Issue and cause:- ODI 11g has a bug where, once the session log sequence generator reaches a certain threshold, the link between session logs and the executed load plan breaks and you can no longer open the session logs from the load plans.

Fix:- The sequence generators for the various elements are stored in the SNP_ID table of the work repository, so we need to reset the SNP_SESSION entry in that table.

1) Take backup of your work repository.
2) Shut down all ODI services and Agents (Standalone if any).
3) Make sure there are no running sessions in the repository.
4) Purge all logs from the work repository, including scenario reports, and make sure there are no abandoned sessions left.
5) Connect to the work repository and execute the queries below to reset the sequence and make sure all session logs are purged.

##########################################################################


/* Reset the SNP_SESSION sequence so that new sessions are numbered from the start again */
update PRD_BIA_ODIREPO.snp_id set id_next = 1 where id_tbl = 'SNP_SESSION';


/* >>> Delete all the Session-related records, using a combination of DELETE and TRUNCATE SQL commands. <<< */

/* Naming conventions used in this script:
 *
 *   PRD_BIA_ODIREPO is the name of the Work Repository Schema.
 *
 * Replace in this document all the occurrences of PRD_BIA_ODIREPO with the name of schema containing the Work Repository tables.
 */

 /*
 * 0. First switch off the AutoCommit mode
 */
    set autocommit off ;

/*
 * 1. Delete Session parameters and messages
 */

    delete from PRD_BIA_ODIREPO.SNP_EXP_TXT where I_TXT in
    (select I_TXT_TASK_MESS from PRD_BIA_ODIREPO.SNP_SESS_TASK_LOG) ;

    delete from PRD_BIA_ODIREPO.SNP_EXP_TXT where I_TXT in
    (select I_TXT_STEP_MESS from PRD_BIA_ODIREPO.SNP_STEP_LOG) ;

    delete from PRD_BIA_ODIREPO.SNP_EXP_TXT where I_TXT in
    (select I_TXT_VAR from PRD_BIA_ODIREPO.SNP_VAR_SESS) ;

    delete from PRD_BIA_ODIREPO.SNP_EXP_TXT where I_TXT in
    (select I_TXT_DEF_T from PRD_BIA_ODIREPO.SNP_VAR_SESS) ;

    delete from PRD_BIA_ODIREPO.SNP_EXP_TXT where I_TXT in
    (select I_TXT_SESS_PARAMS from PRD_BIA_ODIREPO.SNP_SESSION) ;

    delete from PRD_BIA_ODIREPO.SNP_EXP_TXT where I_TXT in
    (select I_TXT_SESS_MESS from PRD_BIA_ODIREPO.SNP_SESSION) ;
     
/*
 * 2. Delete Session Header parameters and messages
 */
    delete from PRD_BIA_ODIREPO.SNP_EXP_TXT_HEADER where I_TXT in
    (select I_TXT_TASK_MESS from PRD_BIA_ODIREPO.SNP_SESS_TASK_LOG) ;

    delete from PRD_BIA_ODIREPO.SNP_EXP_TXT_HEADER where I_TXT in
    (select I_TXT_STEP_MESS from PRD_BIA_ODIREPO.SNP_STEP_LOG) ;

    delete from PRD_BIA_ODIREPO.SNP_EXP_TXT_HEADER where I_TXT in
    (select I_TXT_VAR from PRD_BIA_ODIREPO.SNP_VAR_SESS) ;

    delete from PRD_BIA_ODIREPO.SNP_EXP_TXT_HEADER where I_TXT in
    (select I_TXT_DEF_T from PRD_BIA_ODIREPO.SNP_VAR_SESS) ;

    delete from PRD_BIA_ODIREPO.SNP_EXP_TXT_HEADER where I_TXT in
    (select I_TXT_SESS_PARAMS from PRD_BIA_ODIREPO.SNP_SESSION) ;

    delete from PRD_BIA_ODIREPO.SNP_EXP_TXT_HEADER where I_TXT in
    (select I_TXT_SESS_MESS from PRD_BIA_ODIREPO.SNP_SESSION) ;
     
/*
 * 3. Delete Session execution reports
 */

    /* (a). Deactivate the constraints */
    alter table PRD_BIA_ODIREPO.SNP_SESS_TXT_LOG disable constraint FK_SESS_TXT_LOG ;
    alter table PRD_BIA_ODIREPO.SNP_SESS_TASK_LS disable constraint FK_SESS_TASK_LS ;
    alter table PRD_BIA_ODIREPO.SNP_SESS_TASK_LS disable constraint FK_SESS_LS_SEQ ;
    alter table PRD_BIA_ODIREPO.SNP_SESS_TASK_LOG disable constraint FK_SESS_TASK_LOG ;
    alter table PRD_BIA_ODIREPO.SNP_SESS_STEP_LV disable constraint FK_SESS_STEP_LV ;
    alter table PRD_BIA_ODIREPO.SNP_SESS_STEP_LV disable constraint FK_SESS_LV_VAR ;
    alter table PRD_BIA_ODIREPO.SNP_STEP_LOG disable constraint FK_STEP_LOG ;
    alter table PRD_BIA_ODIREPO.SNP_TASK_TXT disable constraint FK_TASK_TXT ;
    alter table PRD_BIA_ODIREPO.SNP_SESS_TASK disable constraint FK_SESS_TASK ;
    alter table PRD_BIA_ODIREPO.SNP_SESS_STEP disable constraint FK_SESS_STEP ;
    alter table PRD_BIA_ODIREPO.SNP_SEQ_SESS disable constraint FK_SEQ_SESS ;
    alter table PRD_BIA_ODIREPO.SNP_VAR_SESS disable constraint FK_VAR_SESS ;
    alter table PRD_BIA_ODIREPO.SNP_PARAM_SESS disable constraint FK_PARAM_SESS ;

    /* (b). Delete */
    truncate table PRD_BIA_ODIREPO.SNP_SESS_TXT_LOG ;
    truncate table PRD_BIA_ODIREPO.SNP_SESS_TASK_LS ;
    truncate table PRD_BIA_ODIREPO.SNP_SESS_TASK_LOG ;
    truncate table PRD_BIA_ODIREPO.SNP_SESS_STEP_LV ;
    truncate table PRD_BIA_ODIREPO.SNP_STEP_LOG ;
    truncate table PRD_BIA_ODIREPO.SNP_TASK_TXT ;
    truncate table PRD_BIA_ODIREPO.SNP_SESS_TASK ;
    truncate table PRD_BIA_ODIREPO.SNP_SESS_STEP ;
    truncate table PRD_BIA_ODIREPO.SNP_SEQ_SESS ;
    truncate table PRD_BIA_ODIREPO.SNP_VAR_SESS ;
    truncate table PRD_BIA_ODIREPO.SNP_PARAM_SESS ;   
    truncate table PRD_BIA_ODIREPO.SNP_SESSION ;
   
    /* (c). Reactivate the constraints */
    alter table PRD_BIA_ODIREPO.SNP_SESS_TXT_LOG enable constraint FK_SESS_TXT_LOG ;
    alter table PRD_BIA_ODIREPO.SNP_SESS_TASK_LS enable constraint FK_SESS_TASK_LS ;
    alter table PRD_BIA_ODIREPO.SNP_SESS_TASK_LS enable constraint FK_SESS_LS_SEQ ;
    alter table PRD_BIA_ODIREPO.SNP_SESS_TASK_LOG enable constraint FK_SESS_TASK_LOG ;
    alter table PRD_BIA_ODIREPO.SNP_SESS_STEP_LV enable constraint FK_SESS_STEP_LV ;
    alter table PRD_BIA_ODIREPO.SNP_SESS_STEP_LV enable constraint FK_SESS_LV_VAR ;
    alter table PRD_BIA_ODIREPO.SNP_STEP_LOG enable constraint FK_STEP_LOG ;
    alter table PRD_BIA_ODIREPO.SNP_TASK_TXT enable constraint FK_TASK_TXT ;
    alter table PRD_BIA_ODIREPO.SNP_SESS_TASK enable constraint FK_SESS_TASK ;
    alter table PRD_BIA_ODIREPO.SNP_SESS_STEP enable constraint FK_SESS_STEP ;
    alter table PRD_BIA_ODIREPO.SNP_SEQ_SESS enable constraint FK_SEQ_SESS ;
    alter table PRD_BIA_ODIREPO.SNP_VAR_SESS enable constraint FK_VAR_SESS ;   
    alter table PRD_BIA_ODIREPO.SNP_PARAM_SESS enable constraint FK_PARAM_SESS ;
     
/*
 * 4. Delete Scenario associated messages
 */
    delete from PRD_BIA_ODIREPO.SNP_EXP_TXT where I_TXT in
    (select I_TXT_STEP_MESS from PRD_BIA_ODIREPO.SNP_STEP_REPORT) ;   

    delete from PRD_BIA_ODIREPO.SNP_EXP_TXT where I_TXT in
    (select I_TXT_SESS_MESS from PRD_BIA_ODIREPO.SNP_SCEN_REPORT) ;   
     
 /*
 * 5. Delete Scenario Header associated messages
 */
    delete from PRD_BIA_ODIREPO.SNP_EXP_TXT_HEADER where I_TXT in
    (select I_TXT_STEP_MESS from PRD_BIA_ODIREPO.SNP_STEP_REPORT) ;   

    delete from PRD_BIA_ODIREPO.SNP_EXP_TXT_HEADER where I_TXT in
    (select I_TXT_SESS_MESS from PRD_BIA_ODIREPO.SNP_SCEN_REPORT) ;   
     
 /*
 * 6. Delete Scenario associated reports
 */
    /* (a). Deactivate the constraints */
    alter table PRD_BIA_ODIREPO.SNP_STEP_REPORT disable constraint FK_STEP_REPORT ;
   
    /* (b). Delete */
    truncate table PRD_BIA_ODIREPO.SNP_STEP_REPORT ;
    truncate table PRD_BIA_ODIREPO.SNP_SCEN_REPORT ;

    /* (c). Reactivate the constraints */
    alter table PRD_BIA_ODIREPO.SNP_STEP_REPORT enable constraint FK_STEP_REPORT ;
   
/*
 * 7. Commit the changes
 */
    commit ;



#######################################################################
 

Clear the tmp and cache folders in the ODI server paths; after a restart, the ODI server will create new tmp folders automatically.

Now you can start your ODI services and agents back up and test by triggering a new load plan.
You will see that the sessions start with 2..., and you will now be able to open the session logs directly from the executed load plans.

Oracle references:-
Unable To Open The Scenario Object Using The Link Of The Execution Session ID From Load Plan Session Log In ODI Studio Operator (Doc ID 2222817.1)
How To Manually Delete ODI 10g And 11g Sessions And Scenario-Related Reports (Doc ID 424740.1)
 

K@run@