
Sr. Data Analyst Resume


Chicago, IL

SUMMARY

  • 8+ years of professional experience in different environments and platforms, including Data Analysis, data warehousing, data modeling, data mapping, data lineage, data profiling, Informatica Data Quality (IDQ), Informatica Data Analyst (IDA), Informatica PowerCenter, and client-server applications in the banking industry.
  • Provided support on production issues including troubleshooting, coordination with IT, and end user communication related to data issues.
  • Highly experienced in Installation, Configuration, and Administration of Informatica Data Quality and Informatica Data Analyst.
  • Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter.
  • Experience in transforming unstructured data from various data sources using SSIS transformations such as Conditional Split, Lookup, Merge Join, and Sort.
  • Expertise in data modeling (dimensional and relational) concepts like Star Schema modeling, Snowflake Schema modeling, and Fact and Dimension tables.
  • Hands-on experience in Power BI and Power Pivot, Power BI integration with SharePoint, and creating dashboards in Power BI.
  • Customized variables to user-defined formats using PROC FORMAT when printing reports.
  • Strong knowledge in technology selection by gathering and reviewing customer information, working with technology teams to determine if a custom solution is necessary, and recommending the best option to the customer.
  • Strong IT experience in the field of Data analysis, ETL Development, Data Modeling, and Project Management with experience in Big Data and related Hadoop technologies.
  • Extensive experience in building relationships with diverse clientele, engaging as a liaison/specialist between clients and teams and working under pressure with minimum supervision.
  • Developed list, crosstab, drill-through, master-detail, chart, and complex reports involving multiple prompts, multi-query reports, and cubes in Report Studio, as well as burst reports against the database.
  • Worked with Global Drill- through Definitions to drill through from analysis studio reports to report studio reports (MOLAP - ROLAP)
  • Strong experience in developing Business Intelligence assets using tools such as Informatica PowerCenter, Informatica Data Quality, OBIEE, Tableau, Oracle, and others.
  • Very good understanding and experience in User Acceptance Testing, Smoke Testing, Regression Testing, Performance Testing and Functional Testing.
  • Monitored high profile projects to ensure requests are met on time and within sizing, keeping all parties informed of impacts and changes.
  • Pulled and compiled daily reports for record owners and reviewers; responded to stakeholder requests and questions, managed escalations, and provided continuous stakeholder support.
  • Proficient in performance analysis, monitoring, and SQL query tuning using EXPLAIN PLAN, COLLECT STATISTICS, hints, and SQL trace in both Teradata and Oracle.
  • Performed numerous Ad-hoc requests, financial reports involving SQL scripts, UNIX, SAS and Teradata.
  • Expert in T-SQL DDL/DML; performed most SQL Server Enterprise Manager and Management Studio functionality using T-SQL scripts and batches.
  • Deep understanding of and hands-on experience with data subsetting, profiling, and cloning.
  • Extensive experience with OLTP/OLAP systems and E-R modeling, developing database schemas such as Star schema and Snowflake schema used in relational, dimensional, and multidimensional modeling.
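As a minimal sketch of the query-tuning workflow mentioned above (EXPLAIN PLAN in Teradata/Oracle), the snippet below uses SQLite's equivalent, EXPLAIN QUERY PLAN, to show how adding an index changes the optimizer's access path; the table and column names are purely illustrative, not from any real engagement:

```python
import sqlite3

# Illustrative data: a small accounts table with no indexes yet.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER, branch TEXT, balance REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?, ?)",
                 [(i, f"BR{i % 10}", i * 1.5) for i in range(1000)])

def plan(sql):
    """Return the optimizer's access plan for a query as one string."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(str(r) for r in rows)

query = "SELECT SUM(balance) FROM accounts WHERE branch = 'BR3'"
before = plan(query)   # without an index: a full table scan

conn.execute("CREATE INDEX idx_accounts_branch ON accounts (branch)")
after = plan(query)    # with the index: a keyed search on the filter column

print("before:", before)
print("after: ", after)
```

The same compare-the-plan loop applies in Teradata or Oracle, just with their native EXPLAIN output instead of SQLite's.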

TECHNICAL SKILLS

  • R
  • Python
  • SQL
  • SAS Base
  • Basic proficiency in Scala
  • Stata
  • MATLAB
  • Hadoop
  • MapReduce
  • MS Azure ML
  • Cloudera
  • Spark
  • SAS EM
  • Eclipse
  • Erwin
  • IPython
  • SQL Server
  • Spring Framework
  • Jenkins
  • Maven
  • MySQL
  • Oracle
  • RedShift
  • Tableau
  • MS Excel
  • MS PowerPoint
  • QlikView
  • SAP
  • Microsoft Power BI
  • Business Objects
  • JIRA
  • TFS
  • RTC
  • GitHub
  • Windows
  • Mac OS
  • UNIX and Linux

PROFESSIONAL EXPERIENCE

Confidential, Chicago, IL

Sr. DATA ANALYST

RESPONSIBILITIES:

  • Collected, cleaned, sanitized, and analyzed large volumes of data; automated self-monitoring, self-diagnosing, and self-correcting solutions to optimize key processes.
  • Mined data from sources, e.g., Oracle, MS SQL Server, DB2, and Hadoop based data storage systems.
  • Integrated data from different sources such as Azure Blob Storage, Azure SQL Database, Azure Data Lake, SQL Server DB, flat files, MS Excel, etc., in CDM.
  • Performed data analysis on the analytical data present in AWS S3, AWS Redshift, Snowflake, and Teradata using SQL, Python, Spark, and Databricks.
  • Analyzed historical data nodes, then re-engineered and mapped them per user requirements.
  • Review normalized schemas for effective and optimum performance tuning queries and data validations in OLTP and OLAP environments.
  • Manipulate and prepare data, extract data from database for business analyst using Tableau.
  • Participated in the development of enhancements for the current Commercial and Mortgage Securities.
  • Perform segmentation analytics for each campaign using database technologies present both on premise (such as SQL, Teradata, UNIX) and on Cloud platform using AWS technologies and Big Data technologies such as Spark, Python and Databricks.
  • Built dashboards using Tableau and Power BI and presented before clients.
  • Worked on customer requirement analytics and improved overall customer satisfaction.
  • Created various profiles using Informatica Data Explorer (IDE) & Informatica Data Quality (IDQ), from existing sources and shared those profiles with business analysts for their analysis on defining business strategies in terms of activities such as assigning matching scores for different criteria etc.
  • Automated data-quality alerts to a Slack channel and email using Databricks, Python, and HTML, notifying users of any issues with the data.
  • Wrote complex SQL and PL/SQL queries to identify granularity issues and relationships between data sets and created recommended solutions based on analysis of the query results.
  • Created a road map for the client for the planning, developing, and implementing of MDM solutions, enabling consolidation of MDM data following Mergers and Acquisitions.
  • Mined geospatial data (GIS). Used multiple R libraries like ggmap, rgdal, gdal, rgeos, etc.
  • Used R (RStudio) and Python (PySpark) for data cleaning, visualization, and risks & predictive analytics.
  • Proven track record in troubleshooting of ETL mappings and addressing production issues like performance tuning and enhancement.
  • Performed count validation, dimensional analysis, statistical analysis and data quality validation in data migration.
  • Worked with various SWIFT payment messages such as MT100, MT101, MT103, MT199, MT940, MT950, and MT942.
  • Implemented an MDM process to take strategy to roadmap and design development activities. Delivered MDM roadmap & MDM Architecture.
  • Perform root cause analysis on smaller self-contained data analysis tasks that are related to assigned data processes.
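The data-quality alerting described above can be sketched roughly as follows: run simple row-level checks, then format the payload a Slack incoming webhook expects. The check rules, channel name, and field names here are hypothetical placeholders, not the production logic:

```python
import json

def run_checks(rows):
    """Return a list of human-readable data-quality failures."""
    failures = []
    for i, row in enumerate(rows):
        if row.get("customer_id") is None:
            failures.append(f"row {i}: missing customer_id")
        if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
            failures.append(f"row {i}: invalid amount {row.get('amount')!r}")
    return failures

def build_slack_payload(failures, channel="#data-quality"):
    """Build the JSON body for a Slack incoming webhook (placeholder channel)."""
    text = f"{len(failures)} data-quality issue(s) found:\n" + "\n".join(failures)
    return json.dumps({"channel": channel, "text": text})

rows = [{"customer_id": 1, "amount": 25.0},
        {"customer_id": None, "amount": -3}]
failures = run_checks(rows)
payload = build_slack_payload(failures)
print(payload)
# Actually posting the payload (e.g. via urllib.request to the webhook URL)
# is omitted here, since the URL would be environment-specific.
```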

Confidential, Richmond, VA

DATA ANALYST

RESPONSIBILITIES:

  • Managed functional requirements for interfaces and conversions between other legacy systems to Teradata, MDM, Enhancements, Workflows and Reports for MDM.
  • Involved in data mapping specifications to create and execute detailed system test plans; the data mapping specifies what data will be extracted from an internal data warehouse, transformed, and sent to an external entity.
  • Assisted in building models using techniques like Regression (linear/logistic), Tree based ensemble methods, Time Series forecasting, KNN, Clustering, SVM and others.
  • Worked on requirement analysis, data analysis, and gap analysis of various source systems feeding in from multiple systems; responsible for BI data quality.
  • Implemented metadata standards, Collibra Data Governance and stewardship, master data management, ETL, ODS, data warehouse, data marts, reporting, dashboard, analytics, segmentation, and predictive modelling.
  • Used RTC (Rational Team Concert by IBM) as an agile tracking tool.
  • Worked as a data engineer and provided end to end support by extracting, analysing, and interpreting data using aggregation functions, making strategic recommendations, and presenting before internal clients.
  • Provided continued maintenance and development of bug fixes for the existing and new Power BI Reports.
  • Developed the required data warehouse model using Star schema for the generalized model.
  • Worked closely with the Enterprise Data Warehouse team and Business Intelligence Architecture team to understand repository objects that support the business requirement and process.
  • Applied senior-level SQL query skills (Oracle and T-SQL) in analyzing and validating SSIS ETL and data warehouse processes.
  • Perform data comparison between SDP (Streaming Data Platform) real time data with AWS S3 data and Snowflake data using Databricks, Spark SQL, and Python.
  • Used SAS ODS to create output files in different formats, including PDF and RTF.
  • Collected business requirements to set rules for proper data transfer from Data Source to Data Target in Data Mapping.
  • Used Hive and Sqoop to build tables and pull data from transactional databases into NoSQL databases (Accumulo, a Hadoop BigTable implementation).
  • Identified meaningful insights from chargeback data.
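The source-to-target comparison described above (SDP streaming data vs. S3/Snowflake) ran on Spark SQL and Databricks; the same reconciliation logic can be sketched on plain Python dicts keyed by a primary key. Dataset contents and the `id`/`amt` field names are illustrative assumptions:

```python
def compare_datasets(source, target, key="id"):
    """Return keys missing from target, extra in target, and mismatched rows."""
    src = {row[key]: row for row in source}
    tgt = {row[key]: row for row in target}
    missing = sorted(src.keys() - tgt.keys())        # in source only
    extra = sorted(tgt.keys() - src.keys())          # in target only
    mismatched = sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])
    return {"missing": missing, "extra": extra, "mismatched": mismatched}

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}, {"id": 3, "amt": 30}]
target = [{"id": 1, "amt": 10}, {"id": 2, "amt": 99}, {"id": 4, "amt": 40}]
report = compare_datasets(source, target)
print(report)
```

At scale the same three set operations map naturally onto Spark SQL anti-joins and an inner join with a column-wise inequality filter.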

Confidential, Pittsburgh, PA

DATA ANALYST

RESPONSIBILITIES:

  • Analyzed formatted data using machine-learning algorithms with Python scikit-learn; experienced in Python, Jupyter, and the scientific computing stack (NumPy, SciPy, pandas, and matplotlib).
  • Troubleshot, fixed, and deployed many Python bug fixes for the two main applications that served as primary data sources for both customers and the internal customer service team.
  • Wrote Python scripts to parse JSON documents and load the data into a database.
  • Generated various graphical capacity-planning reports using Python packages such as NumPy and matplotlib.
  • Analyzed generated logs and forecast the next occurrence of events using various Python libraries.
  • Created Autosys batch processes to fully automate the model to pick the latest and best-fitting bond for that market.
  • Involved in migration projects moving data warehouses from Oracle/DB2 to Teradata.
  • Created complex Teradata scripts to generate ad-hoc reports that supported and monitored day-to-day operations.
  • Transformed requirements into data structures that efficiently store, manipulate, and retrieve information.
  • Extensively used Informatica Data Quality transformations - Labeller, Parser, Standardizer, Match, Association, Consolidation, Merge, Address Validator, Case Converter, and Classifier.
  • Cleaned and processed third-party spending data into manageable deliverables in specific formats using Excel macros and Python libraries such as NumPy, SQLAlchemy, and matplotlib.
  • Created, activated, and programmed in Anaconda environments.
  • Automated workflows previously initiated manually, using Python scripts and Unix shell scripting.
  • Created various data quality mappings in Informatica Data Quality (IDQ) tool and imported them into Informatica PowerCenter as mappings, mapplets.
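The parse-JSON-and-load pattern mentioned above can be shown in a few lines; the schema and field names below are illustrative assumptions (using SQLite here simply because it ships with Python), not the original project's database:

```python
import json
import sqlite3

# A sample JSON document, standing in for files pulled from a source system.
doc = '[{"name": "alice", "age": 34}, {"name": "bob", "age": 29}]'

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, age INTEGER)")

# Parse the JSON, then bulk-insert using named placeholders that match
# the parsed dict keys.
records = json.loads(doc)
conn.executemany("INSERT INTO people (name, age) VALUES (:name, :age)", records)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM people").fetchone()[0]
print(count)
```

The named-placeholder form keeps the insert robust to key ordering in the source documents and avoids string-building SQL by hand.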

Confidential, Troy, MI

DATA ANALYST

RESPONSIBILITIES:

  • Led the implementation of complex projects/initiatives impacting one or more lines of business.
  • Design, develop, test, and deliver software solutions and maintain technical documentation.
  • Responsible for governance of the Alteryx server; developed and maintained data workflows in Alteryx, built integrations from Alteryx into reporting platforms such as Power BI, and supported regional operations through the development of metrics and dashboards.
  • Conducting research and analysis of the competitive market environment for innovative business products and services, regulatory/compliance issues, and implications of new implementations.
  • Ensuring compliance and risk management requirements for the supported area are met and working with other stakeholders to implement key risk initiatives.
  • Building and publishing customized interactive Power BI reports and dashboards, along with data-refresh scheduling.
  • Pulling and compiling daily reports for record owners and reviewers; responding to stakeholder requests and questions, managing escalations, and continuously providing support to stakeholders.
  • Strong knowledge of complex business problems; providing subject-matter expertise for technology initiatives.
  • Assist in the creation and validation of key risk metrics and methods and technical analysis and interpretation.
  • Strong knowledge in technology selection by gathering and reviewing customer information, working with technology teams to determine if a custom solution is necessary, and recommending the best option to the customer.
  • Monitoring high profile projects to ensure requests are met on time and within sizing, keeping all parties informed of impacts and changes.

Confidential, Los Angeles, CA

DATA ANALYST

RESPONSIBILITIES:

  • Involved in Master Data Management (MDM) to help the organization with strategic decision-making and process improvements (streamlining data sharing among personnel and departments).
  • Worked on transforming the requirements into data structures, which can be used to efficiently store, manipulate and retrieve information.
  • Collaborated with data modelers and ETL developers in creating the Data Functional Design documents.
  • Involved in Creating Dashboards and visualization of different types for analysis, monitoring, management and better understanding of the business performance metrics.
  • Involved in Creating Dashboards Scorecards, views, pivot tables, charts, for further data analysis.
  • Define and communicate project scope, milestones/deliverables and projected requirements from clients.
  • Use Tableau for SQL queried data, and data analysis, generating reports, graphics and statistical analysis.
  • Raised questions up front to determine key issues and foresee potential show-stoppers that might arise later; in-depth understanding of OTC derivatives operations.
  • Involved in developing reports using advanced SQL techniques like RANK, ROW_NUMBER, etc.
  • Analyze data using Tableau for automation and determine business data trends.
  • Provided guidance for transitioning from Access to SQL Server.
  • Transferred data objects and queries from MS Excel to SQL Server.
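The RANK/ROW_NUMBER reporting technique noted above can be sketched with SQLite window functions (available in SQLite 3.25+); the sales table and its regions are illustrative, not from the original reports:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, rep TEXT, total REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("East", "A", 100.0), ("East", "B", 250.0),
    ("West", "C", 180.0), ("West", "D", 180.0), ("West", "E", 90.0),
])

# ROW_NUMBER assigns a unique position per region even among ties;
# RANK lets tied totals share the same rank and then skips ahead.
rows = conn.execute("""
    SELECT region, rep, total,
           ROW_NUMBER() OVER (PARTITION BY region ORDER BY total DESC) AS rn,
           RANK()       OVER (PARTITION BY region ORDER BY total DESC) AS rk
    FROM sales
    ORDER BY region, rn
""").fetchall()
for r in rows:
    print(r)
```

Note how the two tied West reps both get RANK 1 while their ROW_NUMBERs differ, which is exactly the distinction that matters for top-N-per-group reports.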
