data insights Archives - Indium
https://www.indiumsoftware.com/blog/tag/data-insights/

Unveiling the Shadows: Understanding the Reach and Possible Security Threats of Your Digital Footprint
https://www.indiumsoftware.com/blog/unveiling-the-shadows-understanding-the-reach-and-possible-security-threats-of-your-digital-footprint/
Wed, 31 May 2023

Like any footprint, a digital footprint is the mark we leave behind in the digital world whenever we use an application or website on the internet. We may not realise how big our digital footprint is, but rest assured it is far larger than we imagine. Every application collects tonnes of data every day, and this data is refined to build ever-sharper insights into our lives. Companies like Google probably know more about us than we know about ourselves.

In this article, let us take Google's applications as an example and see how deep their access to our personal lives runs, along with the role of digital assurance and security testing.

Not so long ago, we did not need an email ID to set up a new Android phone; these days, configuring a new Android device requires one. What happens when we enter our email address? Google immediately learns the model of phone we have bought, and it very likely knows all the previous phones we owned, because we probably entered the same email address on those devices as well.

Contacts

The next thing we do is add all our contacts from our previous phone. This task used to be very tedious, but these days we can sync our contacts to our email ID and pull them onto the new device with just a few taps. Any user will find this feature helpful, as it saves a lot of time, but let's consider how much data we are handing over. We may have saved our parents' numbers as Dad or Mom, which identifies our parents; our siblings can be identified the same way; in fact, our entire family tree could be drawn from our contact list. Even our car's or bike's brand can be inferred if we have saved the brand's service representative's number. Just by syncing our contacts, we reveal a great deal about ourselves.

Maps 

When we use Google Maps, we search for a location, get directions, and travel there. If we leave 'Location' switched on, Google will know every mall, shop, restaurant, and other spot we visit. From this data, it can infer our lifestyle and spending habits.

Payments

Google Pay, often known as GPay, is the company’s own payment app. We get rewarded for our transactions, it’s free to use, and it’s really simple to set up and use. Who would refuse to use such a program? Let’s take a moment to consider the issues involved. We provide Google access to information about our bank accounts, financial situation, spending patterns, and much more. As a result of tracking our financial transactions, Google can now analyse regional cash flows and make predictions about the financial health of countries and regions. It can forecast what month individuals prefer to shop for a particular type of product. For retailers and other businesses, this kind of information is a gold mine.


YouTube search

YouTube has become a part of our lives. Whether it be education or entertainment, we rely on YouTube for our needs. People also use it to earn some extra income or even as a full-time income source.

While we surf YouTube, we help YouTube learn about our taste in various fields; for example, one may frequently search for Italian dishes or Western outfit designs. These tastes of ours are recorded on their server, and these data are then used to give us recommendations specifically tailored for us.

Apart from this, let us see various other things YouTube knows about us:

  1. YouTube knows about our health issues; we may have searched “remedies for back pain”, “remedies for neck pain”, “remedies for knee pain”, “solutions for insomnia” or “ways to tackle some sort of addiction,” which clearly conveys our problems to YouTube.
  2. YouTube can guess the dish we are going to cook today, as we may have searched for that recipe.
  3. YouTube might know our plans and destination for the tour, as we may have tried to research our destination on YouTube.
  4. YouTube might know that someone in our friend or family circle is getting married; our YouTube search may be evidence of that.
  5. YouTube knows about your favourite movie or TV series and the genre you are into.
  6. YouTube knows about the skills we have been trying to learn.

Because we turn to YouTube for solutions, it is aware of the majority of our issues. There is a benefit to this as well. The YouTube algorithm assesses all the data it has gathered from us and provides us with the finest recommendations. The advice could be for a similar entertainment video, a product that might be beneficial for our health, or training programmes that can help us improve our skills. This makes us feel like we are being catered to.

Gallery

Let’s check to see if the same is true for the gallery.

Our phones’ Gallery app is more intelligent than ever. It may tag each photo with the location where it was taken, automatically make a collage for us, and highlight old memorable moments by unexpectedly displaying a group of pictures that read “One year ago today.” That’s not all; in the modern era, these apps are able to identify people in photos by their faces. The fact that our phone has learned to identify a person by glancing at their face gives me the creeps, even though this feature may be interesting and important to know about.

While the digital age has brought numerous benefits, it has also exposed us to certain security threats. Here are some common security threats associated with our digital footprints:

1. Identity Theft: Cybercriminals can exploit the information found in our digital footprints, such as personal details, social media posts, and online transactions, to impersonate us and commit identity theft. This can result in financial loss and reputational damage.

2. Phishing Attacks: Digital footprints can provide valuable information to cyber attackers, enabling them to craft sophisticated phishing emails or messages that appear genuine. By tricking individuals into revealing sensitive information, such as login credentials or financial details, attackers can gain unauthorized access to accounts or conduct fraudulent activities.

3. Data Breach: As we saw above, organizations collect and store vast amounts of data from our digital footprints. If these organizations fail to implement robust security measures, cybercriminals can exploit vulnerabilities to gain unauthorized access and steal sensitive data, leading to data breaches. This can result in financial loss, legal consequences, and reputational damage for both organizations and individuals.

4. Location Tracking and Privacy Invasion: Like Google Maps, many other digital platforms and services also track our locations through GPS, Wi-Fi, or IP addresses. If this information falls into the wrong hands, it can be used for stalking, physical threats, or unauthorized surveillance, compromising our privacy and personal safety.

5. Online Harassment and Cyberbullying: Our digital footprints, including social media posts and online interactions, can make us vulnerable to online harassment and cyberbullying. Personal information shared online can be used to harass, intimidate, or defame individuals, causing emotional distress and potential harm.

To mitigate these security threats, it is crucial that we be cautious about the information we share online. We must regularly review privacy settings, use strong and unique passwords, enable two-factor authentication wherever available, and stay updated on the latest cybersecurity practices. Additionally, organisations must prioritise data security by implementing encryption, conducting regular security audits, and educating employees about potential risks and best practices for protecting sensitive information.
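One of the practices above, using strong and unique passwords, is easy to automate. A minimal sketch using Python's standard-library `secrets` module (the length and character set here are illustrative choices, not a recommendation from this post):

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Return a cryptographically random password of the given length."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Generate a distinct password for each account instead of reusing one.
print(generate_password())
```

Because `secrets` draws from the operating system's cryptographic random source, each call produces an independent, unguessable password, which is exactly the "unique per account" property a password manager relies on.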


Conclusion

This article has only scratched the surface of the so-called digital footprint; its origins go back much further than what has been covered here. Should we be concerned? Do we need to take any action at all? There is no right or wrong answer to this question; all I can do is share my viewpoint. Since most of our lives now revolve around the internet, there isn't much we can do to stop it. Having read this post, we might suddenly become cautious about our internet footprint and the traces we leave behind, where before we might not have given them any thought and felt at ease.

Let's draw a simple comparison. Before the invention of computers, if we lived a typical day of going to work, eating supper, and returning home, investigators could literally follow our footprints. In other words, no matter the age, we always leave a trace of ourselves behind. All we can do is exercise caution and avoid disclosing online any sensitive information that might endanger us.

 

Pivot or Persevere? Insights from Gartner's IT Score Benchmarks for Data & Analytics in BFSI
https://www.indiumsoftware.com/blog/pivot-or-persevere-insights-from-gartners-it-score-benchmarks-for-data-analytics-in-bfsi/
Thu, 18 May 2023

Navigating the VUCA world can be challenging, especially for the BFSI sector. With

  • Volatile market conditions,
  • Uncertainty around interest rates and inflation,
  • Complex regulatory requirements, and
  • Ambiguous economic indicators

together spelling VUCA, the industry faces numerous hurdles.

Not Without Pitfalls

With vast amounts of data to process and analyze, the quality and reliability of financial data can be compromised, leading to errors, inconsistencies, and fraud. To keep up, financial services organizations must embrace data analytics, AI, and other emerging technologies to optimize operations, reduce costs, manage risks, and deliver value to their customers. Those who fail to do so risk falling behind in the VUCA world of today.

According to Gartner’s “IT Score Benchmarks for Data & Analytics in Banking, Finance, and Insurance,” BFSI organizations are investing heavily in data and analytics capabilities to improve their decision-making, customer engagement, and risk management.

  1. Maturity levels in data and analytics indicate how advanced an organization is in implementing and utilizing data-driven strategies and technologies, from early stages to advanced adoption.
  2. Importance levels reflect the significance of data and analytics activities in driving business value, decision-making, and achieving strategic goals.

However, the report highlights the top 3 biggest gaps between importance and maturity levels, which will be shared exclusively with you. By understanding these levels, organizations can prioritize efforts to improve data quality, implement advanced analytics techniques, and leverage data for informed decision-making, ultimately enhancing their overall data and analytics capabilities.
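The idea of ranking importance-versus-maturity gaps can be sketched in a few lines. The scores below are hypothetical placeholders for illustration only, not Gartner's benchmark figures:

```python
# Hypothetical 1-5 scores; replace with your organization's own assessment.
activities = {
    "Data quality":       {"importance": 5, "maturity": 2},
    "Advanced analytics": {"importance": 4, "maturity": 2},
    "Data governance":    {"importance": 5, "maturity": 3},
    "Self-service BI":    {"importance": 3, "maturity": 3},
}

# Rank activities by the gap between how much they matter and how mature
# the organization is at them; the biggest gaps are the priorities.
ranked = sorted(activities.items(),
                key=lambda kv: kv[1]["importance"] - kv[1]["maturity"],
                reverse=True)

for name, score in ranked[:3]:
    print(name, "gap =", score["importance"] - score["maturity"])
```

The same ordering logic works regardless of how the scores are sourced; what matters is that prioritization is driven by the gap, not by importance or maturity alone.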

Challenges to Adopting Data Analytics Transformation in the Banking Industry

Above all, a well-crafted strategy is essential. Without a clear plan of action, investing in data analytics tools and technologies is a waste of time and resources. Here are the “Top 3 Insights from Gartner’s 2023 IT Score Benchmarks for BFSI”

Source: Gartner

Every data analytics effort should be built on a well-defined plan. Without one, you may buy all the technology globally and still spin your wheels. Any effective strategy begins with determining the main goals and objectives you want to accomplish. Not only will this offer you a clear way forward, but it will also make it simpler to interact with key decision-makers. Every company-wide project needs executive buy-in, so adequately explaining what your organization stands to gain through implementing banking analytics may assist in securing a “yes” from those in positions of authority.

Defining which metrics and key performance indicators to measure is difficult if important goals and objectives are not identified. Without a clear vision of what you’re striving for or how to measure success, it becomes more difficult for employees at all levels of your business to support banking analytics projects, which can interfere with organizational change management (OCM) and user adoption efforts. It also prevents you from implementing advanced data analysis in banking as effectively as you might or should: You may ask incorrect questions, preventing algorithms from providing useful insights.

Once you've established a solid plan, the next steps are to develop excellent data governance and deploy the appropriate technology. Data governance ensures that all the data gathered in banking is handled ethically, compliantly, and responsibly. It establishes essential requirements for where data is stored, how it is accessed, and how it is utilized, all of which guarantee that your employees are supported and that your data procedures are sustainable over the long term. This is a target that most businesses in the BFSI sector have not met. But where do we stand on a scale of 1 to 5?

One important aspect missing from our list is setting a clear aspiration. However, we believe every institution should aim to establish analytics as a core business discipline used by decision-makers throughout the organization.

Analytics should be like a reflex, much like the human nervous system, with every part of the bank knowing how to react to specific stimuli. At the same time, different banks may move at varying paces in building and training their analytics systems: some nerve paths will already be established, while others need to be developed and taught to react accordingly.

Then, old legacy systems that keep data segregated across multiple lines of business (sometimes even from one team member to the next) must be replaced. These silos demand extra effort from both customers and customer-facing team members, slowing operations and severely impacting the customer experience. As a result, banks and other financial institutions must invest in advanced analytics solutions that let them resolve issues and provide financial advice in a timely manner.


7 Tips for Success

Developing a successful banking analytics strategy requires careful planning and execution. Here are some tips to help set you on the right path:

1. Start Small & Scale Gradually: Instead of trying to take on too much too soon, focus on achieving small wins that can fund future projects while delivering the most significant ROI.

2. Adopt an Iterative Learning Approach: Learn from the experience and adopt an iterative approach to treat each project as an opportunity to improve and learn.

3. Build a Comprehensive Data Ecosystem: Utilize internal & external data sources to build a more comprehensive data ecosystem, providing valuable context and enhancing insights from internal data.

Source: Gartner

4. Ask the Right Questions: Determine what information you want to surface and which questions will help you obtain the most meaningful data before committing resources.

5. Choose User-Friendly Solutions: Look for banking analytics solutions that feature intuitive and visually appealing visualizations and dashboards that make data-driven insights easily accessible and understandable.

6. Obtain Executive Buy-In: Executive buy-in is crucial to ensure that employees at all levels of the business are on board with new systems and strategies.

7. Automate Where Possible: Automating low-level service requests saves valuable time and allows employees to focus on high-level requests that drive greater value.

Remember to align your strategy with performance metrics, KPIs, and governance, and assemble a winning team with both data science expertise and industry experience. By following these tips, you can develop a successful banking analytics strategy that drives meaningful insights and improves business outcomes.


Energy Shots!

Digital transformation is a never-ending journey for BFSI organizations, as they need to constantly evolve to meet changing customer demands, comply with new regulations, and stay ahead of their competitors. But upgrading legacy systems is not a quick fix and can be complex. Banks must prioritize the most critical areas for modernization and ensure that the transformation is comprehensive, swift, and customer-centric.

To successfully upgrade their core banking systems, banks should:

  1. Create a clear roadmap that aligns with their vision,
  2. Identify key metrics for measuring success, and
  3. Expand their team with domain experts, data specialists, and tech professionals.

They should also prioritize cleaning up their data and ensuring it’s of high quality, conduct training for staff, and document all functional and technical knowledge around the core banking system.

Digital transformation requires careful planning and execution, and banks must be prepared to face new risks and challenges. By following these steps, banks can future-proof their operations, improve customer experiences, and stay competitive in the dynamic financial landscape.

EXCLUSIVE UPDATE – eBook coming soon!

Power BI Meta Data extraction using Python
https://www.indiumsoftware.com/blog/power-bi-meta-data-extraction-using-python/
Wed, 17 May 2023

In this blog we are going to learn about Power BI .pbit files and Power BI Desktop file metadata, and how to extract that metadata and save it as an Excel file using a .pbit file and simple Python code with libraries like Pandas, OS, Regex, JSON, and dax_extract.

What are Power BI and .pbix files?

Power BI is a market-leading business intelligence tool from Microsoft for cleaning, modifying, and visualizing raw data to arrive at actionable insights. Power BI comes with its own data transformation engine called Power Query and a formula expression language called DAX (Data Analysis Expressions).

DAX gives Power BI the ability to create calculated columns, dynamic measures, and tables inside Power BI Desktop.

By default, Power BI report files are saved with the .pbix extension, which is a renamed ZIP file containing multiple components such as the visuals, report canvas, model metadata, and data.

What is a Power BI .pbit file?

.pbit is a template file format created by Power BI Desktop. It too is a renamed ZIP file, one that contains all the metadata for a Power BI report but not the data itself. When we extract a .pbit file, we get a DataModelSchema file along with other files, which together hold all the metadata of a Power BI Desktop file.

Later in this blog we will use these .pbit and DataModelSchema files to extract Power BI Desktop metadata.

What is the metadata in a Power BI Desktop file?

Metadata is everything behind what you see in the Report View of Power BI Desktop. Think of it as all the descriptive information: the name, source, expression, and data type of each field; calculated tables, calculated columns, and calculated measures; the relationships and lineage between the model's various tables; hierarchies; parameters; and so on.

We will mainly concentrate on extracting Calculated Measures, Calculated Columns, and Relationships in this blog.

Extraction of metadata using Python

Python was used to process and extract the JSON from the .pbit file and DataModelSchema. We first converted the JSON to a Python dictionary before extracting the necessary metadata.

Below are the steps we will need to achieve the requirement:

 

1. Exporting the .pbix file as a .pbit file

There are two ways to save our Power BI Desktop file as a .pbit file:

  • Once we are in Power BI Desktop, we can save our file directly as a Power BI template (.pbit) file.
  • Or we can go to File -> Export -> Power BI Template and save the .pbit file in the desired directory.

2. Unzipping the .pbit file to get the DataModelSchema file

We can unzip the .pbit file directly using the 7-Zip file manager or any other archive tool. Once we unzip it, we get a folder with the same name as the .pbit file. Inside that folder is the DataModelSchema file; we change its extension to .txt so it can be read in Python.
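The manual unzip step can also be scripted, since a .pbit file is just a renamed ZIP archive. A sketch (the stand-in template is created here only so the example runs end to end; with a real report, point `pbit_path` at your own .pbit file):

```python
import json
import zipfile

# Create a tiny stand-in .pbit so the sketch is self-contained; skip this
# step when you have a real template file.
pbit_path = "report.pbit"
with zipfile.ZipFile(pbit_path, "w") as archive:
    archive.writestr("DataModelSchema", json.dumps({"name": "demo"}))

# Because .pbit is a renamed ZIP, zipfile can open it directly -- no manual
# renaming or 7-Zip step is required.
with zipfile.ZipFile(pbit_path) as archive:
    print(archive.namelist())                       # ['DataModelSchema']
    archive.extract("DataModelSchema", path="extracted")
```

This also sidesteps the rename-to-.txt step, since `zipfile` does not care about the archive's extension.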

3. Reading the .pbit and DataModelSchema files in Python

One option is to read the .pbit file directly in Python using the dax_extract library. The second option is to read the text file in Python and convert it into a Python dictionary using the JSON module. The code can be found in the GitHub repository link given at the end of this post.
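For the second option, the sketch below loads the extracted DataModelSchema into a dictionary. One caveat worth hedging: the file is often UTF-16 encoded rather than UTF-8, so the loader tries UTF-16 first and falls back. The sample file is written here only to keep the example self-contained; with a real export, point `schema_path` at the file you unzipped.

```python
import json
import os

# Write a small UTF-16 sample so the sketch runs end to end.
os.makedirs("extracted", exist_ok=True)
schema_path = os.path.join("extracted", "DataModelSchema")
with open(schema_path, "w", encoding="utf-16-le") as f:
    json.dump({"model": {"tables": [{"name": "Calendar"}]}}, f)

def load_schema(path: str) -> dict:
    """Parse DataModelSchema, trying UTF-16 first, then UTF-8."""
    for encoding in ("utf-16-le", "utf-8"):
        try:
            with open(path, encoding=encoding) as f:
                return json.load(f)
        except (UnicodeError, json.JSONDecodeError):
            continue
    raise ValueError(f"could not parse {path} as JSON")

model = load_schema(schema_path)
print(model["model"]["tables"][0]["name"])   # Calendar
```

Once `model` is a plain dictionary, everything that follows is ordinary key and list traversal.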

4. Extracting Measures from the dictionary

The dictionary we get contains details of all the tables as separate lists. Individual tables hold details of the columns and measures belonging to that table, so we can loop over the tables one by one and collect details of columns, measures, and so on. Below is a sample output; the Python code can be found in the GitHub repository link given at the end of this post.

   Table Number  Table Name  Measure Name              Measure Expression
0  5             Query Data  % Query Resolved          CALCULATE(COUNT('Query Data'[Client ID]),'Quer…
1  5             Query Data  Special Query Percentage  CALCULATE(COUNT('Query Data'[Client ID]),'Quer…
2  6             Asset Data  Client Retention Rate     CALCULATE(COUNT('Asset Data'[Client ID]),'Asse…
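The loop described above can be sketched as follows. The key names assume the usual DataModelSchema layout (model -> tables -> measures), which is worth verifying against your own file; multi-line DAX expressions may arrive as a list of strings, which the sketch joins back together:

```python
def extract_measures(schema: dict) -> list[dict]:
    """Collect every measure's table, name, and DAX expression."""
    rows = []
    for index, table in enumerate(schema["model"]["tables"]):
        for measure in table.get("measures", []):
            expression = measure.get("expression", "")
            if isinstance(expression, list):     # multi-line DAX
                expression = "\n".join(expression)
            rows.append({"table number": index,
                         "table name": table["name"],
                         "measure name": measure["name"],
                         "measure expression": expression})
    return rows

# Minimal sample dictionary mimicking the schema structure.
sample = {"model": {"tables": [
    {"name": "Query Data",
     "measures": [{"name": "% Query Resolved",
                   "expression": "CALCULATE(COUNT('Query Data'[Client ID]))"}]},
]}}
for row in extract_measures(sample):
    print(row)
```

Tables with no measures simply contribute nothing, since `.get("measures", [])` yields an empty list for them.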

 

5. Extracting calculated columns from the dictionary

Just as we extracted the measures, we can loop over each table and get details of all the calculated columns. Below is a sample output; the Python code can be found in the GitHub repository link given at the end of this post.

 

   Table No  Table Name  Name     Expression
6  2         Calendar    Day      DAY('Calendar'[Date])
7  2         Calendar    Month    MONTH('Calendar'[Date])
8  2         Calendar    Quarter  CONCATENATE("Q", QUARTER('Calendar'[Date]))
9  2         Calendar    Year     YEAR('Calendar'[Date])
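A very similar loop covers calculated columns. In the usual schema layout they sit in each table's "columns" list and are distinguished from regular data columns by a "type": "calculated" marker; that marker is an assumption worth checking against your own DataModelSchema:

```python
def extract_calculated_columns(schema: dict) -> list[dict]:
    """Collect calculated columns, skipping regular data columns."""
    rows = []
    for index, table in enumerate(schema["model"]["tables"]):
        for column in table.get("columns", []):
            if column.get("type") != "calculated":
                continue                         # plain data column
            expression = column.get("expression", "")
            if isinstance(expression, list):     # multi-line DAX
                expression = "\n".join(expression)
            rows.append({"table no": index,
                         "table name": table["name"],
                         "name": column["name"],
                         "expression": expression})
    return rows

sample = {"model": {"tables": [
    {"name": "Calendar",
     "columns": [{"name": "Date", "dataType": "dateTime"},
                 {"name": "Year", "type": "calculated",
                  "expression": "YEAR('Calendar'[Date])"}]},
]}}
print(extract_calculated_columns(sample))
```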

 


6. Extracting relationships from the dictionary

Data for relationships is available under the model key of the data dictionary and can be easily extracted. Below is a sample output; the Python code can be found in the GitHub repository link given at the end of this post.

 

  From Table From Column To Table To Column State
0 Operational Data Refresh Date LocalDateTable_50948e70-816c-4122-bb48-2a2e442… Date ready
1 Operational Data Client ID Client Data Client ID ready
2 Query Data Query Date Calendar Date ready
3 Asset Data Client ID Client Data Client ID ready
4 Asset Data Contract Maturity Date LocalDateTable_d625a62f-98f2-4794-80e3-4d14736… Date ready
5 Asset Data Enrol Date Calendar Date ready
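Because relationships live under the model key rather than inside individual tables, no per-table loop is needed. The key names below follow the columns of the sample output above and the usual schema layout, so treat them as assumptions to verify:

```python
def extract_relationships(schema: dict) -> list[dict]:
    """Flatten the model-level relationship entries."""
    rows = []
    for rel in schema["model"].get("relationships", []):
        rows.append({"from table": rel.get("fromTable"),
                     "from column": rel.get("fromColumn"),
                     "to table": rel.get("toTable"),
                     "to column": rel.get("toColumn"),
                     "state": rel.get("state")})
    return rows

sample = {"model": {"relationships": [
    {"fromTable": "Query Data", "fromColumn": "Query Date",
     "toTable": "Calendar", "toColumn": "Date", "state": "ready"},
]}}
print(extract_relationships(sample))
```

Using `.get(...)` for every key keeps the loop tolerant of entries that omit a field, such as auto-generated LocalDateTable relationships.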

 

7. Saving Extracted data as an Excel file

All the extracted rows can be collected in lists, and these lists can be used to build a Pandas data frame. The data frame can then be exported to Excel and used for reference and validation in a complex model. The snippet below gives an idea of how this can be done.
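Putting it together, each collected list becomes a DataFrame and each metadata type gets its own sheet. A sketch with toy rows standing in for the extraction loops' real output (the file name is a placeholder, and writing .xlsx requires an Excel engine such as openpyxl to be installed):

```python
import pandas as pd

# Toy rows standing in for the lists built by the extraction loops above.
measures = [{"table name": "Query Data",
             "measure name": "% Query Resolved",
             "measure expression": "CALCULATE(COUNT('Query Data'[Client ID]))"}]
relationships = [{"from table": "Query Data", "to table": "Calendar"}]

# One sheet per metadata type keeps the workbook easy to navigate.
with pd.ExcelWriter("pbi_metadata.xlsx") as writer:
    pd.DataFrame(measures).to_excel(writer, sheet_name="Measures", index=False)
    pd.DataFrame(relationships).to_excel(
        writer, sheet_name="Relationships", index=False)
```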


Conclusion

In this blog we learnt how to extract metadata from the .pbit and DataModelSchema files. We created a Python script that lets users enter the file locations of the .pbit and DataModelSchema files, after which the metadata extraction and Excel generation are automated. The code, along with sample Excel files, can be found at the GitHub link below. Hope this was helpful; see you soon with another interesting topic.

 
