Tuesday, November 23, 2021

Python - Script Debug DBMS


# %%
import datetime
from dateutil.relativedelta import relativedelta
#------------------------------------------------------------------------------------------------
#BEGIN Time
t_aFull=""
t_bFull=""
def diff(t_bFull, t_aFull):
    t_diffFull = relativedelta(t_bFull,t_aFull)  # later/end time comes first!
    return '{h}h {m}m {s}s'.format(h=t_diffFull.hours, m=t_diffFull.minutes, s=t_diffFull.seconds)

t_aFull = datetime.datetime.now()
print("========================== ")
print("Time Starting: ",t_aFull)
print("==========================")
#------------------------------------------------------------------------------------------------

########################################Script Body##############################################

# %%
#------------------------------------------------------------------------------------------------
#End timer
t_bFull = datetime.datetime.now()
#diff(t_bFull,t_aFull)
print("========================== ")
print("Time Finish: ",t_bFull)
print("==========================")
print("Time Total: ",diff(t_aFull,t_bFull))
print("==========================")
#------------------------------------------------------------------------------------------------
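For a quick sanity check of the elapsed-time formatting, here is a stdlib-only variant of the helper (using timedelta arithmetic instead of relativedelta) evaluated on a fixed interval:

```python
import datetime

def diff_td(t_end, t_start):
    # Same "Xh Ym Zs" formatting as diff(), via stdlib timedelta arithmetic
    total = int((t_end - t_start).total_seconds())
    h, rem = divmod(total, 3600)
    m, s = divmod(rem, 60)
    return '{h}h {m}m {s}s'.format(h=h, m=m, s=s)

start = datetime.datetime(2021, 11, 23, 10, 0, 0)
end = datetime.datetime(2021, 11, 23, 11, 30, 15)
print(diff_td(end, start))  # 1h 30m 15s
```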

Thursday, November 11, 2021

SQL - scripts DBA

/* DBA queries */
--=================================================================================
/* Query table sizes */
SELECT
    CASE
        WHEN t.NAME = '/BIC/FZSD_C23'  THEN 'WS - Vendas Integradas Histórico'
        WHEN t.NAME = '/BIC/VZSD_C23F' THEN 'FACTVIEW - INFOCUBE ZSD_C23'
        ELSE '???'
    END AS ETL,
    t.NAME AS Entidade,
    COUNT(DISTINCT(p.partition_id)) AS partition_id,
    SUM(DISTINCT(p.rows)) AS Registros,
    COUNT(DISTINCT(c.column_id)) AS Qtd_Cols,
    SUM(a.total_pages) * 8 AS EspacoTotalKB,
    SUM(a.used_pages) * 8 AS EspacoUsadoKB,
    (SUM(a.total_pages) - SUM(a.used_pages)) * 8 AS EspacoNaoUsadoKB,
    MAX(t.modify_date) AS Ultima_Atualizacao,
    MAX(t.create_date) AS Ultima_Atualizacao2
FROM sys.tables t
INNER JOIN sys.all_columns c ON t.OBJECT_ID = c.OBJECT_ID
INNER JOIN sys.indexes i ON t.OBJECT_ID = i.object_id
INNER JOIN sys.partitions p ON i.object_id = p.OBJECT_ID AND i.index_id = p.index_id
INNER JOIN sys.allocation_units a ON p.partition_id = a.container_id
LEFT OUTER JOIN sys.schemas s ON t.schema_id = s.schema_id
WHERE UPPER(t.NAME) LIKE UPPER('%logy%')
--t.NAME NOT LIKE 'xpto%' and
--t.is_ms_shipped = 0 and
--i.OBJECT_ID > 255
GROUP BY t.Name
-- ,p.partition_id
ORDER BY Ultima_Atualizacao DESC, 4 DESC --Registros DESC
;
--=================================================================================
/* Query table columns by LIKE */
SELECT DISTINCT
    t.NAME AS Entidade,
    c.Name AS Coluna,
    c.max_length,
    c.precision,
    c.object_id
FROM sys.tables t
INNER JOIN
--select * from
sys.all_columns c ON t.OBJECT_ID = c.OBJECT_ID
INNER JOIN sys.indexes i ON t.OBJECT_ID = i.object_id
INNER JOIN sys.partitions p ON i.object_id = p.OBJECT_ID AND i.index_id = p.index_id
INNER JOIN sys.allocation_units a ON p.partition_id = a.container_id
LEFT OUTER JOIN sys.schemas s ON t.schema_id = s.schema_id
WHERE UPPER(c.NAME) LIKE UPPER('%logy%')
--t.NAME NOT LIKE 'xpto%' and
--t.is_ms_shipped = 0 and
--i.OBJECT_ID > 255
ORDER BY 1, 2
--=================================================================================
/* Global variables
select @@VERSION ;
select @@SERVERNAME ;
select @@CONNECTIONS ;
select @@MAX_CONNECTIONS ;
select @@CPU_BUSY ;
select @@ERROR ;
select @@IDENTITY ;
select @@IDLE ;
select @@IO_BUSY ;
select @@LANGID ;
select @@LANGUAGE ;
--select @@MAXCHARLEN ;
select @@PACK_RECEIVED ;
select @@PACK_SENT ;
select @@PACKET_ERRORS ;
select @@ROWCOUNT ;
select @@SPID ;
select @@TEXTSIZE ;
select @@TIMETICKS ;
select @@TOTAL_ERRORS ;
select @@TOTAL_READ / @@TOTAL_WRITE ;
select @@TRANCOUNT ;
*/

Wednesday, September 15, 2021

Power BI Service - Power Query - DataFormat.Error: Invalid cell value '#N/A'

 There are two points to be evaluated:

1. If you want to discard the errors

In this case, just run:


  #"Removed errors" = Table.RemoveRowsWithErrors(StepBefore),


2. If you want to fix the errors because some columns still hold values worth analyzing

Select the rows with errors so you can inspect the valid values in them, and then run the following steps:


  #"Kept errors" = Table.SelectRowsWithErrors(StepBefore),

  #"Replaced errors" = Table.ReplaceErrorValues(#"Kept errors", {{[ColumnWithError], null}}),

  

After detecting the column in error, you can delete the Step:  

  #"Kept errors" = Table.SelectRowsWithErrors(StepBefore),


I hope this helps!

Wednesday, July 21, 2021

Power BI with Python 3.7 - Step by Step

 #Commands 

conda --version

python --version

conda info --envs

conda info

conda list

conda env list

conda list -n myenv scipy

conda search jupyter 

conda list

conda search tensorflow 



#Install/Uninstall Commands 


conda install --help (...)  

conda remove -n PyPWBI --all -y 


#IMPORT Environment 


conda create --name PyPWBI python=3.7 

conda deactivate 

conda activate PyPWBI 

source activate PyPWBI (to verify...) 


conda install numpy pandas matplotlib seaborn scikit-learn jupyter nose scipy

conda install -c conda-forge squarify

conda install -c conda-forge pandas-profiling

conda install -c conda-forge schedule

pip install speedtest-cli

pip install openpyxl

pip install investpy


pip uninstall matplotlib pillow numpy

pip install matplotlib pillow numpy


pip install yfinance --upgrade --no-cache-dir

Monday, July 5, 2021

Power Query - Connect API REST

1. Example (in parts)


GetAccessToken:
 () => 
let
    url = "https://wwwXPTO.com/Auth/GetApiToken",
    body  = "{ ""user"": ""userXPTO"", ""pwd"": ""123$""} ",
    AccessToken = Json.Document(Web.Contents(url, [Headers=[#"Content-Type"="application/json"], Content=Text.ToBinary(body)])),
    access_token = AccessToken[token]
in
    access_token
========================================================
GetList:
let
  // Get_Data
  Query = () => let
    Source = Json.Document(Web.Contents("https://wwwXPTO.com/GetList", [Headers=[Authorization="Bearer " & GetAccessToken()]])),
    #"Converted to Table" = Record.ToTable(Source),
    Value = #"Converted to Table"{4}[Value]
in
    Value
in
  Query
========================================================
FAT_Table:
let
    Source = GetList(),
    #"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error)
in
#"Converted to Table"
========================================================


2. Example Full

let 
    tenantId = "435435435345435435",
    clientId = "34543543534535345543",
    clientSecret = "35435435435435435435",
    grantType = "client_credentials",
    resource = "https://graph.windows.net/",
    endpointUsers = "/users",
    apiVersion = "?api-version=1.6",
    baseURL="https://login.microsoftonline.com",
    relativePath = "/"&tenantId&"/oauth2/token",
    url = baseURL & relativePath,
   body2 = [
          client_id=clientId,
          grant_type=grantType,
          client_secret=clientSecret,
          resource=resource],
    Source = Json.Document(Web.Contents(url, [Headers=[#"Accept" = "application/json", #"Content-Type"="application/x-www-form-urlencoded;charset=UTF-8"],Content = Text.ToBinary(Uri.BuildQueryString(body2))])), 
    accessToken = Source[access_token],
    tokenType = Source[token_type],
    usersURL = tenantId & endpointUsers &apiVersion,
    fullAccessToken = tokenType&" "&accessToken,
    Users = Json.Document(Web.Contents(resource & tenantId & endpointUsers &apiVersion,[Headers=[ContentType="application/json", Authorization=fullAccessToken, RelativePath=usersURL]])),
    value = Users[value],
    #"Converted to Table" = Table.FromList(value, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"odata.type", "objectType", "objectId", "deletionTimestamp", "accountEnabled", "ageGroup", "assignedLicenses", "assignedPlans", "city", "companyName", "consentProvidedForMinor", "country", "createdDateTime", "creationType", "department", "dirSyncEnabled", "displayName", "employeeId", "facsimileTelephoneNumber", "givenName", "immutableId", "isCompromised", "jobTitle", "lastDirSyncTime", "legalAgeGroupClassification", "mail", "mailNickname", "mobile", "onPremisesDistinguishedName", "onPremisesSecurityIdentifier", "otherMails", "passwordPolicies", "passwordProfile", "physicalDeliveryOfficeName", "postalCode", "preferredLanguage", "provisionedPlans", "provisioningErrors", "proxyAddresses", "refreshTokensValidFromDateTime", "showInAddressList", "signInNames", "sipProxyAddress", "state", "streetAddress", "surname", "telephoneNumber", "thumbnailPhoto@odata.mediaEditLink", "usageLocation", "userIdentities", "userPrincipalName", "userState", "userStateChangedOn", "userType"}, {"odata.type", "objectType", "objectId", "deletionTimestamp", "accountEnabled", "ageGroup", "assignedLicenses", "assignedPlans", "city", "companyName", "consentProvidedForMinor", "country", "createdDateTime", "creationType", "department", "dirSyncEnabled", "displayName", "employeeId", "facsimileTelephoneNumber", "givenName", "immutableId", "isCompromised", "jobTitle", "lastDirSyncTime", "legalAgeGroupClassification", "mail", "mailNickname", "mobile", "onPremisesDistinguishedName", "onPremisesSecurityIdentifier", "otherMails", "passwordPolicies", "passwordProfile", "physicalDeliveryOfficeName", "postalCode", "preferredLanguage", "provisionedPlans", "provisioningErrors", "proxyAddresses", "refreshTokensValidFromDateTime", "showInAddressList", "signInNames", "sipProxyAddress", "state", "streetAddress", "surname", "telephoneNumber", "thumbnailPhoto@odata.mediaEditLink", "usageLocation", "userIdentities", 
"userPrincipalName", "userState", "userStateChangedOn", "userType"})
in
    #"Expanded Column1"


========================================================
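For reference, the `Uri.BuildQueryString(body2)` step in the full example above is just the form-encoding of the token request body; a minimal Python sketch of that same encoding (with hypothetical credential values) looks like this:

```python
from urllib.parse import urlencode

# Hypothetical credentials; in Power Query these come from the body2 record
body = {
    'client_id': 'my-client-id',
    'grant_type': 'client_credentials',
    'client_secret': 'my-secret',
    'resource': 'https://graph.windows.net/',
}
# Bytes ready to POST to the /oauth2/token endpoint
payload = urlencode(body).encode('utf-8')
```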


3. Example with body parameters

let

    url = "https://j?????????????????",

    body  = "{ ""hCreationDate"": { ""$gte"": ""2021-08-01T00:00:00.000Z"",""$lte"": ""2021-08-03T00:00:00.000Z"" } } ",

    vData = Json.Document(Web.Contents(url, [Headers=[#"Content-Type"="application/json" , Authorization="Bearer " & GetAccessToken()], Content=Text.ToBinary(body)])),

    results = vData[results],

    #"Converted to Table" = Table.FromList(results, Splitter.SplitByNothing(), null, null, ExtraValues.Error),

    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"_id", "action", "feedId", "segment", "userId", "hCreationDate"}, {"_id", "action", "feedId", "segment", "userId", "hCreationDate"})

in 

    #"Expanded Column1"





Thursday, May 13, 2021

DAX - CALCULATE Actualized vs. Planned

Measure:
 .Inflation_PLAN =
    CALCULATE(
        DIM_Prem_Plan[.Vlr_Inflation_PLAN],
        FILTER(DIM_Prem_Plan, DIM_Prem_Plan[PK_Plan] IN VALUES(FAT_DB[PK_Plan]))
    )


Tuesday, May 11, 2021

Power Query - Connect API REST API Bearer

 

FNC_Access_Token

() => 

let

    url = "https://XPTO_SITE_TOKEN",

    body  = "{ ""user"": ""XPTO_USER"", ""pwd"": ""XPTO_PASS""} ",

    AccessToken = Json.Document(Web.Contents(url, [Headers=[#"Content-Type"="application/json"], Content=Text.ToBinary(body)])),

    access_token = AccessToken[token]

in

    access_token


FNC_Table_Prep

let

  // Get_Data

  Query = () => let

    Source = Json.Document(Web.Contents("XPTO_SITE_SEARCH", [Headers=[Authorization="Bearer " & Fnc_Access_Token()]])),

    #"Converted to Table" = Record.ToTable(Source),

    Value = #"Converted to Table"{4}[Value]

in

    Value

in

  Query


DIM_Table

let

    Source = FNC_Table_Prep(),

    #"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),

    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"_id", "user", "phone", "profile", "hLastUpdate", "active", "activationCode", "userName", "birthday", "gender", "followingCategories", "isProfileComplete", "name", "dob"}, {"_id", "user", "phone", "profile", "hLastUpdate", "active", "activationCode", "userName", "birthday", "gender", "followingCategories", "isProfileComplete", "name", "dob"})

in

    #"Expanded Column1"


Monday, May 10, 2021

Power Query - Block IF THEN ELSE

 let

    Source = (ano as text) => let 

    

        EmptyTable = #table({"/BIC/ZMATERIAL",...}, {}),

        Source = PowerBI.Dataflows(null),


        //THEN

        WorkspacePRO = Source{[workspaceId="564564563466353653665"]}[Data],

        DataflowPRO = WorkspacePRO{[dataflowName=Text.Combine({"MIX_Full_", ano})]}[Data],

        FAT_PricesPRO = DataflowPRO{[entity=Text.Combine({"MIX_FullDetail_", ano})]}[Data],

    

        //ELSE

        WorkspacePPU = Source{[workspaceId="356633456346456456456"]}[Data],

        DataflowPPU = WorkspacePPU{[dataflowName=Text.Combine({"MIX_Full_", ano})]}[Data],

        FAT_PricesPPU = DataflowPPU{[entity=Text.Combine({"MIX_FullDetail_", ano})]}[Data],


        Final_Table = if PRO_PPU = "PRO" then 

        FAT_PricesPRO else 

        FAT_PricesPPU,

        

        Final_Table2 = try Final_Table otherwise EmptyTable

    in 

        Final_Table2

in

    Source

Power Query - Connect OneDrive

 let

Source = SharePoint.Files("https://XXX.sharepoint.com/personal/XXX_com/", [ApiVersion = 15]),
#"Linhas Filtradas" = Table.SelectRows(Source, each Text.Contains([Folder Path], "XPTO_FOLDER")),
Step2 = Table.SelectColumns(#"Linhas Filtradas", {"Content"}),
Step3 = Table.AddColumn(Step2, "Cols", each Excel.Workbook([Content])),
Step4 = Table.RemoveColumns(Step3, {"Content"}),
Step5 = Table.ExpandTableColumn(Step4, "Cols", {"Name", "Data", "Item", "Kind", "Hidden"}, {"Name", "Data", "Item", "Kind", "Hidden"}),
#"Linhas Filtradas1" = Table.SelectRows(Step5, each ([Kind] = "Sheet")),
#"Linhas Filtradas2" = Table.SelectRows(#"Linhas Filtradas1", each [Name] = "Dados"),
Step7 = Table.RemoveColumns(#"Linhas Filtradas2", {"Item", "Kind", "Hidden", "Name"}),
#"Data Expandida" = Table.ExpandTableColumn(Step7, "Data", {"Column1", "Column2", "Column3", "Column4", "Column5", "Column6", "Column7", "Column8", "Column9", "Column10", "Column11", "Column12", "Column13"}, {"Column1", "Column2", "Column3", "Column4", "Column5", "Column6", "Column7", "Column8", "Column9", "Column10", "Column11", "Column12", "Column13"})
in
#"Data Expandida"

Tuesday, April 27, 2021

Power Query - IF THEN ELSE

 let

    Source = PowerBI.Dataflows(null),

    #"Added Custom" = Table.AddColumn(#"Expanded DIM_Prem_Cambio", "Value_USD", each if [Currency] = "USD" then  [Value]   

else if [Currency] = "EUR" then ([Value] * [EUR])   

else if [Currency] = "BRL" then ([Value] * [BRL]) 


else if [Currency] = "CAD" then ([Value] * [CAD]) 

else if [Currency] = "CHF" then ([Value] * [CHF]) 

else if [Currency] = "GBP" then ([Value] * [GBP]) 

else if [Currency] = "KRW" then ([Value] * [KRW]) 

else if [Currency] = "MYR" then ([Value] * [MYR]) 

else if [Currency] = "RMB" then ([Value] * [RMB]) 

else if [Currency] = "VND" then ([Value] * [VND]) 

else if [Currency] = "INR" then ([Value] * [INR]) 

else if [Currency] = "JPY" then ([Value] * [JPY]) 

else if [Currency] = "CNY" then ([Value] * [CNY]) 

else if [Currency] = "SEK" then ([Value] * [SEK]) 

else 0)

in

    #"Added Custom"

Wednesday, April 21, 2021

DATAFLOW PPU x PRO

Workspace PPU - (USER PPU)

Resource - Create table_PPU (Enable load - OFF) 


Workspace PRO - (USER PPU)

Resource - Create table_PRO_Prep (load table_PPU) (Enable load - OFF)

Load table_PRO (table_PRO_Prep) (Enable load - ON)


Workspace PRO - (USER PRO)

Create PBIX(load table_PRO) (Enable load - ON)

Monday, April 19, 2021

Power Query - Replace Function letter fnc_replace

(Texto as text) =>
let
    ListAccents = 
 {
 {null,"."},
 {"à","a"},
 {"á","a"},
 {"â","a"},
 {"ã","a"},
 {"ä","a"},
 {"è","e"},
 {"é","e"},
 {"ê","e"},
 {"ë","e"},
 {"ì","i"},
 {"í","i"},
 {"î","i"},
 {"ï","i"},
 {"ò","o"},
 {"ó","o"},
 {"ô","o"},
 {"õ","o"},
 {"ö","o"},
 {"ù","u"},
 {"ú","u"},
 {"û","u"},
 {"ü","u"},
 {"À","A"},
 {"Á","A"},
 {"Â","A"},
 {"Ã","A"},
 {"Ä","A"},
 {"È","E"},
 {"É","E"},
 {"Ê","E"},
 {"Ë","E"},
 {"Ì","I"},
 {"Í","I"},
 {"Î","I"},
 {"Ò","O"},
 {"Ó","O"},
 {"Ô","O"},
 {"Õ","O"},
 {"Ö","O"},
 {"Ù","U"},
 {"Ú","U"},
 {"Û","U"},
 {"Ü","U"},
 {"ç","c"},
 {"Ç","C"},
 {"ñ","n"},
 {"Ñ","N"}
 }
in
    Text.Combine(List.ReplaceMatchingItems(Text.ToList(Texto), ListAccents))
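The same accent stripping can be done in Python without an explicit mapping table, by decomposing each character (NFD normalization) and dropping the combining marks; a stdlib sketch:

```python
import unicodedata

def fnc_replace(texto):
    # NFD splits 'ç' into 'c' + a combining cedilla; drop the combining marks
    nfd = unicodedata.normalize('NFD', texto)
    return ''.join(ch for ch in nfd if not unicodedata.combining(ch))

print(fnc_replace('Ação Média'))  # Acao Media
```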

Friday, April 16, 2021

Power Query - Python - Load Data with (Param)

 let

    Source = Python.Execute("# In[0]:#(lf)import pandas as pd#(lf)import yfinance as yf#(lf)import investpy#(lf)#(lf)param="&Text.From(P_Y_N)&"#(lf)vdt_BGN = pd.to_datetime('01/01/2021')#(lf)vdt_END = pd.to_datetime('today') + pd.offsets.DateOffset(days=1)#(lf)print(""vdt_BGN:"",vdt_BGN)#(lf)print(""vdt_END:"",vdt_END)#(lf)#(lf)# In[1]:#(lf)df_Tickers = investpy.get_stocks(country='Brazil')#(lf)df_Tickers['AcaoFull'] = df_Tickers['symbol']+'.SA'#(lf)df_Tickers['AcaoName'] = df_Tickers['AcaoFull']+' - '+df_Tickers['name']#(lf)df_Tickers.sort_values(by=['symbol'], inplace=True, ascending=True)#(lf)df_Tickers.drop_duplicates('AcaoFull') #(lf)df_Tickers['IndexCol'] = range(1, len(df_Tickers) + 1)#(lf)df_Tickers#(lf)#(lf)# In[2]:#(lf)update_list = [""BRKM3"",""TWTR34"",""EBAY34"",""JNJB34""]#(lf)df_Tickers['APO'] = 1#(lf)df_Tickers.loc[df_Tickers['symbol'].isin(update_list), 'APO'] = 0#(lf)df_Tickers.sort_values(by=['APO'], inplace=True, ascending=True)#(lf)#(lf)if param == 0:#(lf)    df_Tickers = df_Tickers[df_Tickers['APO'] == 0]#(lf)    #(lf)df_Tickers#(lf)#(lf)# In[3]:#(lf)#datasetTEMP1 = pd.DataFrame()#(lf)#(lf)for num, t in enumerate(df_Tickers['AcaoFull'], start=1):#(lf)  try: #(lf)#    dataset_temp=  yf.Ticker(t).history(period=""1y"")#(lf)    dataset_temp=  yf.Ticker(t).history(start=vdt_BGN, end=vdt_END)#(lf)    dataset_temp['Acao']=t#(lf)    dataset_temp['Updated']=pd.to_datetime('today')#(lf)    #datasetTEMP1 = datasetTEMP1.append([dataset_temp], sort=False)#(lf)    print(""{} TickerList OK: {}"".format(num, t))#(lf)  except Exception as e:#(lf)    print('TickerList Nok {}'.format(t))#(lf)    print(getattr(e, 'message', str(e)))#(lf)    #(lf)dataset = pd.DataFrame() #(lf)dataset = dataset.append([dataset_temp], sort=False)#(lf)dataset = dataset.reset_index()    #(lf)dataset = dataset.set_index([dataset.index])#(lf)dataset = dataset.fillna(0)                  #(lf)dataset['Adj Close'] = dataset['Close']"),

    dataset = Source{[Name="dataset"]}[Value],

    #"Changed Type2" = Table.TransformColumnTypes(dataset,{{"Date", type datetime}, {"Open", type number}, {"High", type number}, {"Low", type number}, {"Close", type number}, {"Volume", type number}, {"Dividends", type number}, {"Stock Splits", type number}, {"Adj Close", type number}, {"Updated", type datetime}}),

    #"Parsed Date" = Table.TransformColumns(#"Changed Type2",{{"Date", each Date.From(DateTimeZone.From(_)), type date}}),

    //  Old  Dataset

    StepJoin = Table.Combine({#"Parsed Date" , FAT_Tickers_TA_Old}),

    #"Merged Queries" = Table.NestedJoin(StepJoin, {"Acao"}, DIM_Tickers, {".AcaoFull"}, "DIM_Tickers", JoinKind.LeftOuter),

    #"Expanded DIM_Tickers" = Table.ExpandTableColumn(#"Merged Queries", "DIM_Tickers", {"IndexCol", ".AcaoFull"}, {"IndexCol", ".AcaoFull"}),

    #"Changed Type" = Table.TransformColumnTypes(#"Expanded DIM_Tickers",{{"Date", type date}, {"Open", type number}, {"High", type number}, {"Low", type number}, {"Close", type number}, {"Volume", Int64.Type}}),

    #"Added Custom" = Table.AddColumn(#"Changed Type", ".PK_Acao_Date", each Text.PadStart([IndexCol],3,"0") & Number.ToText(Date.Year([Date])) & Number.ToText(Date.Month([Date]),"00") & Number.ToText(Date.Day([Date]),"00")),

    #"Sorted Rows" = Table.Sort(#"Added Custom",{{".PK_Acao_Date", Order.Ascending}}),

    #"Added Index" = Table.AddIndexColumn(#"Sorted Rows", "IndexPK", 1, 1, Int64.Type),

    FAT_Tickers_TA1 = Table.Distinct(#"Added Index", {".PK_Acao_Date"})

in

    FAT_Tickers_TA1
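The `.PK_Acao_Date` step above builds a key by zero-padding the ticker index to three characters and appending the date as YYYYMMDD; the same construction in Python, for illustration:

```python
from datetime import date

def pk_acao_date(index_col, d):
    # 3-digit zero-padded index + YYYYMMDD, as in the Power Query step
    return '{:03d}{:%Y%m%d}'.format(index_col, d)

pk_acao_date(7, date(2021, 4, 16))  # '00720210416'
```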

Sunday, April 4, 2021

Python - Task Scheduler

File name (script_python.bat) 

Code:


@echo off 

CALL C:\Users\apo1979\Anaconda3\Scripts\activate.bat PyPWBI

C:/Users/apo1979/Anaconda3/envs/PyPWBI/python.exe "d:/.APO_OneDrive/script_python.py" runserver

pause

Sunday, March 28, 2021

Soccer Data Analysis (companies / platforms)

 Short summary of Soccer Data Analysis

There are many companies / platforms, and one of the big ones is:


Opta Sports

https://www.optasports.com/

-Captures data through people who log every event in certain games ...

-There are generally between 1,800 and 3,000 observations per match ...


There are also other platforms that provide data, such as:

Metrica (game events)

https://metrica-sports.com/


Wyscout (game events)

https://wyscout.com/football-data-api/


SofaScore (buys Opta data)

https://corporate.sofascore.com/


Additionally, there are other very interesting ones that make predictions:


Soccer Predictions

https://projects.fivethirtyeight.com/soccer-predictions/

https://projects.fivethirtyeight.com/global-club-soccer-rankings/


Other interesting material that I have started studying comes from this YouTube channel:

https://www.youtube.com/channel/UCUBFJYcag8j2rm_9HkrrA7w/about



All this material came together with great help from Matheus Evald XL7


Wednesday, February 24, 2021

DAX - Create Table JOIN / UNION

 FAT_wGalaxy = UNION(

    SELECTCOLUMNS(FAT_wGalaxy_R,"Data",[Data],"Valor",[Valor],"Obs",[Observacao],"Conta",[Conta],"CD",[.ReceitaDespesa],"Classe",[.Classe],"Real_Prev",[Pago_Projetado],"Status",[Status],

    "Data_Pag",if(ISBLANK([Data_PAG]),[Data],[Data_PAG])),

    

    SELECTCOLUMNS(FAT_wGalaxy_D,"Data",[Data],"Valor",[Valor],"Obs",[Observacao],"Conta",[Conta],"CD",[.ReceitaDespesa],"Classe",[.Classe],"Real_Prev",[Pago_Projetado],"Status",[Status],

    "Data_Pag",if(ISBLANK([Data_PAG]),[Data],[Data_PAG]))

    

    )

Monday, February 22, 2021

DAX - LastValue

.qtdObitosPT_Diario =
var AcumToday = FAT_COVID_PT_Matrix[obitos]
var AcumLastday = MAXX(FILTER(FAT_COVID_PT_Matrix, FAT_COVID_PT_Matrix[data_dados]=(EARLIER(FAT_COVID_PT_Matrix[data_dados])-1)),FAT_COVID_PT_Matrix[obitos])
return
if ( NOT ISBLANK ( AcumToday ), AcumToday - AcumLastday )
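The same cumulative-to-daily logic is a one-liner in pandas, shown here on a tiny made-up series of cumulative deaths:

```python
import pandas as pd

df = pd.DataFrame({
    'data_dados': pd.date_range('2021-02-01', periods=4),
    'obitos': [100, 103, 103, 110],   # made-up cumulative totals
})
# Daily value = today's cumulative total minus the previous day's
df['qtdObitosPT_Diario'] = df['obitos'].diff()
```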

Friday, February 19, 2021

Power Query - Sharepoint.List

 let

//Wrong =>
//Source = SharePoint.Tables("https://XXX.sharepoint.com/sites/XXX", [Implementation=null, ApiVersion=15]),

//Correct =>
 Source = SharePoint.Tables("https://XXX.sharepoint.com/sites/XXX", [Implementation="2.0", ViewMode="All"])

in
  Source


Friday, February 12, 2021

Dataviz tools - Visualizations

 There are many different ways of telling a story, but everything starts with an idea. So to help you get started we've rounded up some of the most awesome data visualisation tools available on the web.

01. Chart Studio

Dataviz tools: Chart Studio

Chart Studio from Plotly can produce some very slick graphics

Make charts, presentations and dashboards with this flexible software. You can perform your analysis using JavaScript, Python, R, Matlab, Jupyter or Excel, and there are several options for importing data. The visualisation library and online chart creation tool allow you to make great-looking graphics. 

02. DataHero

Dataviz tools: DataHero

Anyone can use this to make sense of data from multiple services

DataHero enables you to pull together data from cloud services and create charts and dashboards. No technical abilities are required, so this is a great tool for your whole team to use. 

03. Chart.js

Dataviz tools: Chart.js

Chart.js is perfectly suited to smaller projects

Although armed with only six chart types, open source library Chart.js is the perfect data visualisation tool for hobbies and small projects. Using HTML 5 canvas elements to render charts, Chart.js creates responsive, flat designs, and is quickly becoming one of the most popular open-source charting libraries. Read more about how to use Chart.js here.

04. Tableau

Dataviz tools: Tableau

Create and share data in real time with Tableau

Packed with graphs, charts, maps and more, Tableau Public is a popular data visualisation tool that's also completely free. Users can easily drag and drop data into the system and watch it update in real-time, plus you can collaborate with other team members for quick project turnaround.

05. RAWGraphs

Dataviz tools: RAWGraphs

RAWGraphs is an open web app with a simple interface

Open, customisable, and free to download and modify, RAWGraphs lets users create vector-based data visualisations. Data can be safely uploaded from apps to computers, plus it can be exported as an SVG or PNG and embedded in your webpage.

06. Dygraphs

Dataviz tools: Dygraphs

Help visitors explore dense data sets with JavaScript library Dygraphs

Dygraphs is a fast, flexible open source JavaScript charting library that allows users to explore and interpret dense data sets. It's highly customisable, works in all major browsers, and you can even pinch to zoom on mobile and tablet devices.

07. ZingChart

Dataviz tools: ZingChart

ZingChart lets you create HTML5 Canvas charts and more

ZingChart is a JavaScript charting library and feature-rich API set that lets you build interactive Flash or HTML5 charts. It offers over 100 chart types to fit your data.

08. InstantAtlas

Dataviz tools: InstantAtlas

InstantAtlas enables you to create highly engaging visualisations around map data

If you're looking for a data viz tool with mapping, InstantAtlas is worth checking out. This tool enables you to create highly-interactive dynamic and profile reports that combine statistics and map data to create engaging data visualisations.

09. Modest Maps

Data visualization: Modest Maps

Integrate and develop interactive maps within your site with this cool tool

Modest Maps is a lightweight, simple mapping tool for web designers that makes it easy to integrate and develop interactive maps within your site, using them as a data visualisation tool.

The API is easy to get to grips with, and offers a useful number of hooks for adding your own interaction code, making it a good choice for designers looking to fully customise their user's experience to match their website or web app. The basic library can also be extended with additional plugins, adding to its core functionality and offering some very useful data integration options.

10. Leaflet

Dataviz tools: Leaflet

Use OpenStreetMap data and integrate data visualisation in an HTML5/CSS3 wrapper

Another mapping tool, Leaflet makes it easy to use OpenStreetMap data and integrate fully interactive data visualisation in an HTML5/CSS3 wrapper.

The core library itself is very small, but there are a wide range of plugins available that extend the functionality with specialist functionality, such as animated markers, masks and heatmaps. It's perfect for any project where you need to show data overlaid on a geographical projection.

11. WolframAlpha

Dataviz tools: WolframAlpha

Wolfram Alpha is excellent at creating charts

Billed as a "computational knowledge engine", the Google rival WolframAlpha is really good at intelligently displaying charts in response to data queries, without the need for any configuration. If you're using publicly available data, this offers a simple widget builder to make it really simple to get visualisations on your site.

12. Visualize Free

Dataviz tools: Visualize Free

Make visualisations for free!

Visualize Free is a hosted tool that allows you to use publicly available datasets, or upload your own, and build interactive visualisations to illustrate the data. The visualisations go well beyond simple charts, and the service is completely free, plus while development work requires Flash, output can be done through HTML5.

13. Better World Flux

Orientated towards making positive change to the world, Better World Flux has some lovely visualisations of some pretty depressing data. It would be very useful, for example, if you were writing an article about world poverty, child undernourishment or access to clean water. This tool doesn't allow you to upload your own data, but does offer a rich interactive output.

14. FusionCharts

Dataviz tools: FusionCharts

A comprehensive JavaScript/HTML5 charting solution for your data visualisation needs

FusionCharts Suite XT brings you 90+ charts and gauges, 965 data-driven maps, and ready-made business dashboards and demos. FusionCharts comes with extensive JavaScript API that makes it easy to integrate it with any AJAX application or JavaScript framework. These charts, maps and dashboards are highly interactive, customisable and work across all devices and platforms. 

15. jqPlot

Data visualization: jQPlot

jqPlot is a nice solution for line and point charts

Another jQuery plugin, jqPlot is a nice solution for line and point charts. It comes with a few nice additional features such as the ability to generate trend lines automatically, and interactive points that can be adjusted by the website visitor, updating the dataset accordingly.

16. D3.js

Dataviz tools: D3.js

You can render some amazing diagrams with D3

D3.js is a JavaScript library that uses HTML, SVG, and CSS to render some amazing diagrams and charts from a variety of data sources. This library, more than most, is capable of some seriously advanced visualisations with complex data sets. It's open source, and uses web standards so is very accessible. It also includes some fantastic user interaction support.

17. JavaScript InfoVis Toolkit

Data visualization: JavaScript InfoVis Toolkit

JavaScript InfoVis Toolkit includes a handy modular structure

A fantastic library written by Nicolas Belmonte, the JavaScript InfoVis Toolkit includes a modular structure, allowing you to only force visitors to download what's absolutely necessary to display your chosen data visualisations. This library has a number of unique styles and swish animation effects, and is free to use (although donations are encouraged).


18. jpGraph

Data visualization: jpGraph

jpGraph is a PHP-based data visualization tool

If you need to generate charts and graphs server-side, jpGraph offers a PHP-based solution with a wide range of chart types. It's free for non-commercial use, and features extensive documentation. By rendering on the server, you're guaranteed a consistent visual output, albeit at the expense of interactivity and accessibility.

19. Highcharts

Data visualization: Highcharts

Highcharts has a huge range of options available

Highcharts is a JavaScript charting library with a huge range of chart options available. The output is rendered using SVG in modern browsers and VML in Internet Explorer. The charts are beautifully animated into view automatically, and the framework also supports live data streams. It's free to download and use non-commercially (and licensable for commercial use). You can also play with the extensive demos using JSFiddle.

20. Google Charts

Data visualization: Google Charts

Google Charts has an excellent selection of tools available

The seminal charting solution for much of the web, Google Charts is highly flexible and has an excellent set of developer tools behind it. It's an especially useful tool for specialist visualisations such as geocharts and gauges, and it also includes built-in animation and user interaction controls.

21. Excel

Data visualization: Excel

It isn't graphically flexible, but Excel is a good way to explore data: for example, by creating 'heat maps' like this one

You can actually do some pretty complex things with Excel, from 'heat maps' of cells to scatter plots. As an entry-level tool, it can be a good way of quickly exploring data, or creating visualisations for internal use, but the limited default set of colours, lines and styles make it difficult to create graphics that would be usable in a professional publication or website. Nevertheless, as a means of rapidly communicating ideas, Excel should be part of your toolbox.

Excel comes as part of the commercial Microsoft Office suite, so if you don't have access to it, Google's spreadsheets – part of Google Docs and Google Drive – can do many of the same things. Google 'eats its own dog food', so the spreadsheet can generate the same charts as the Google Chart API. This will get you familiar with what is possible before moving on to using the API directly in your own projects.

22. CSV/JSON

Data visualization: CSV/JSON

CSV and JSON are both common formats for data

CSV (Comma-Separated Values) and JSON (JavaScript Object Notation) aren't actual visualization tools, but they are common formats for data. You'll need to understand their structures and how to get data in or out of them.
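To show how the two formats relate, the same records can be parsed from CSV into objects and then serialised as JSON in a few lines of plain JavaScript. This is a deliberately naive sketch: it assumes no quoted fields or embedded commas, which real CSV parsers must handle.

```javascript
// Convert a simple CSV string (no quoted fields) into an array of objects,
// then serialise that array as JSON.
function csvToObjects(csv) {
  const [headerLine, ...rows] = csv.trim().split('\n');
  const headers = headerLine.split(',');
  return rows.map(row => {
    const values = row.split(',');
    // Pair each header with the value in the same column
    return Object.fromEntries(headers.map((h, i) => [h, values[i]]));
  });
}

const csv = 'city,population\nLondon,8900000\nParis,2100000';
const records = csvToObjects(csv);
const json = JSON.stringify(records);
// records[0] → { city: 'London', population: '8900000' }
```

Note that everything parsed from CSV arrives as a string; converting numeric columns to numbers is up to you, whereas JSON preserves the distinction.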

23. Crossfilter

Data visualization: Crossfilter

Crossfilter in action: by restricting the input range on any one chart, data is affected everywhere

As we build more complex tools to enable clients to wade through their data, we are starting to create graphs and charts that double as interactive GUI widgets. JavaScript library Crossfilter can be both of these. It displays data, but at the same time, you can restrict the range of that data and see other linked charts react.
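The core idea can be sketched in plain JavaScript: restricting the range on one dimension changes what every linked chart computes. This is only an illustration of the concept, not the Crossfilter API itself, which provides fast incremental dimensions and groups; the flight-delay data below is invented.

```javascript
// A plain-JavaScript sketch of cross-filtering (not the Crossfilter API).
const flights = [
  { hour: 7,  delay: 12 },
  { hour: 9,  delay: -3 },
  { hour: 17, delay: 45 },
  { hour: 18, delay: 30 }
];

// Restricting the range on one "chart" (hour of day)...
function filterByHour(data, min, max) {
  return data.filter(d => d.hour >= min && d.hour < max);
}

// ...changes what every linked chart sees, e.g. the average delay.
function averageDelay(data) {
  return data.reduce((sum, d) => sum + d.delay, 0) / data.length;
}

const evening = filterByHour(flights, 16, 20);
// averageDelay(evening) → 37.5
```

Crossfilter's value is doing exactly this kind of filter-and-aggregate loop fast enough to stay interactive over hundreds of thousands of records.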

24. Tangle

Data visualization: Tangle

Tangle creates complex interactive graphics. Pulling on any one of the knobs affects data throughout all of the linked charts

The line between content and control blurs even further with Tangle. When you are trying to describe a complex interaction or equation, letting the reader tweak the input values and see the outcome for themselves provides both a sense of control and a powerful way to explore data. JavaScript library Tangle is a set of tools to do just this.

Dragging on variables enables you to increase or decrease their values and see an accompanying chart update automatically. The results are only just short of magical.
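Tangle's own API is document-driven, but the reactive idea underneath it can be sketched briefly: a variable with listeners, where changing the value re-derives every linked output. The `reactiveVar` helper and the cookies-to-calories example below are hypothetical, for illustration only.

```javascript
// A minimal sketch of the reactive idea behind Tangle (not its actual API):
// a variable notifies linked outputs whenever its value changes.
function reactiveVar(initial) {
  let value = initial;
  const listeners = [];
  return {
    get: () => value,
    set(v) { value = v; listeners.forEach(fn => fn(value)); },
    onChange(fn) { listeners.push(fn); fn(value); }
  };
}

// "Dragging" the cookies variable updates the derived calories readout.
const cookies = reactiveVar(3);
let calories = 0;
cookies.onChange(n => { calories = n * 50; });

cookies.set(5);
// calories → 250
```

In Tangle itself, the draggable number in the prose plays the role of `set`, and every dependent figure in the text updates in the same way.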

25. Polymaps

Data visualization: Polymaps

Aimed more at specialist data visualisers, the Polymaps library creates image and vector-tiled maps using SVG

Polymaps is a mapping library that is aimed squarely at a data visualisation audience. Offering a unique approach to styling the maps it creates, analogous to CSS selectors, it's a great resource to know about.

26. OpenLayers

Data visualization: OpenLayers

It isn't easy to master, but OpenLayers is arguably the most complete, robust mapping solution discussed here

OpenLayers is probably the most robust of these mapping libraries. The documentation isn't great and the learning curve is steep, but for certain tasks nothing else can compete. When you need a very specific tool no other library provides, OpenLayers is always there.

27. Kartograph

Kartograph

Kartograph's projections breathe new life into our standard slippy maps

Kartograph's tag line is 'rethink mapping' and that is exactly what its developers are doing. We're all used to the Mercator projection, but Kartograph brings far more choices to the table. If you aren't working with worldwide data, and can place your map in a defined box, Kartograph has the options you need to stand out from the crowd.

28. Carto

Data visualization: Carto

Carto provides an unparalleled way to combine maps and tabular data to create visualisations

Carto is a must-know site. The ease with which you can combine tabular data with maps is second to none. For example, you can feed in a CSV file of address strings and it will convert them to latitudes and longitudes and plot them on a map, but there are many other uses. It's free for students; for everyone else, there are various monthly pricing plans.

29. Processing

Data visualization: Processing

Processing provides a cross-platform environment for creating images, animations, and interactions

Processing has become the poster child for interactive visualizations. It enables you to write much simpler code which is in turn compiled into Java.

There is also a Processing.js project to make it easier for websites to use Processing without Java applets, plus a port to Objective-C so you can use it on iOS. It is a desktop application, but can be run on all platforms, and given that it is now several years old, there are plenty of examples and code from the community.

30. NodeBox

Data visualization: NodeBox

NodeBox is a quick, easy way for Python-savvy developers to create 2D visualisations

NodeBox is an OS X application for creating 2D graphics and visualizations. You need to know and understand Python code, but beyond that it's a quick and easy way to tweak variables and see results instantly. It's similar to Processing, but without all the interactivity.

31. R

Data visualization: R

A powerful free software environment for statistical computing and graphics, R is the most complex of the tools listed here

How many other pieces of software have an entire search engine dedicated to them? A statistical package used to parse large data sets, R is a very complex tool, and one that takes a while to understand, but has a strong community and package library, with more and more being produced.

R has one of the steepest learning curves of any tool listed here, but if you want to work at this level, it's worth becoming comfortable with it.

32. Weka

Data visualization: Weka

A collection of machine-learning algorithms for data-mining tasks, Weka is a powerful way to explore data

When you get deeper into being a data scientist, you will need to expand your capabilities from just creating visualisations to data mining. Weka is a good tool for classifying and clustering data based on various attributes – both powerful ways to explore data – but it also has the ability to generate simple plots.

33. Gephi

Data visualization: Gephi

Gephi in action. Coloured regions represent clusters of data that the system is guessing are similar

When people talk about relatedness, social graphs and co-relations, they are really talking about how two nodes are related to one another relative to the other nodes in a network. The nodes in question could be people in a company, words in a document or passes in a football game, but the maths is the same.

Gephi, a graph-based visualiser and data explorer, can not only crunch large data sets and produce beautiful visualisations, but also allows you to clean and sort the data. It's a very niche use case and a complex piece of software, but it puts you ahead of anyone else in the field who doesn't know about this gem.

34. iCharts

Data visualization: iCharts

iCharts can have interactive elements, and you can pull in data from Google Docs

The iCharts service provides a hosted solution for creating and presenting compelling charts for inclusion on your website. There are many different chart types available, and each is fully customisable to suit the subject matter and colour scheme of your site.

Charts can have interactive elements, and can pull data from Google Docs, Excel spreadsheets and other sources. The free account lets you create basic charts, while you can pay to upgrade for additional features and branding-free options.

35. Flot

Data visualization: Flot

Create animated visualisations with this jQuery plugin

Flot is a specialised plotting library for jQuery, but it has many handy features and crucially works across all common browsers including Internet Explorer 6. Data can be animated and, because it's a jQuery plugin, you can fully control all the aspects of animation, presentation and user interaction. This does mean that you need to be familiar with (and comfortable with) jQuery, but if that's the case, this makes a great option for including interactive charts on your website.
