Data Analyst Resume
Sunnyvale, CA
SUMMARY
- 6+ years of experience working as a Business Analyst/Data Analyst across a variety of projects
- Understanding of basic statistical methodologies, including simple and multiple linear regression and hypothesis testing
- Proven experience working with large data sets and deriving insights from data using various business intelligence and data analytics tools
- Data warehousing and ETL process experience using Informatica Power Center 9.x/8.x and Oracle, Teradata databases
- Knowledgeable on Data warehousing concepts, data integration using various relational databases and SQL querying for data analysis
- Knowledgeable on creating reports using Tableau Desktop and publishing the report to Tableau server
- Expertise in SQL, Tableau, Omniture, Hive, Advanced Excel, Microsoft office suite
- Good understanding of Data Warehousing concepts and ETL tools
- Worked extensively with complex mappings using different transformations like Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Unconnected / Connected Lookups and Aggregators
- Extensive experience in Complex Mappings, Reusable Objects (Mapplets, Lookups, Transformations, Tasks and Sessions), transformations, extractions and loads, connections and Scheduling the Workflows with Worklets and Sessions
- Expertise in performance tuning by identifying bottlenecks at the source, target, mapping, session and database level. Implemented Type-1, Type-2 and Type-3 Slowly Changing Dimension methodologies in various projects
- Extensively used Informatica Repository Manager and Workflow Monitor
- Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations
- Experience in analyzing and documenting the ETL process flow for better maintenance
- Experience handling Client Accounts and Client Servicing for End clients
- Handled various analytical projects on category management, assortment planning, competitive analysis and monthly KPI reporting
- Excellent communication and presentation skills with detail-oriented analytical abilities
- Proficient in assessing, processing and drawing conclusions from gathered data
- Strong interpersonal skills and organizational skills
- Knowledgeable in Manufacturing, Retail CPG and e-Commerce domains
- Worked in coordination with Business Analysts, Developers, ETL and Data warehouse engineers to validate ETL scripts, create, execute and debug SQL queries to perform data completeness, correctness, data transformation and data quality testing
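The Slowly Changing Dimension work mentioned above was done in Informatica; as a minimal illustration of the Type-2 pattern (expire the current row, insert a new current row so history is preserved), here is a hedged SQL sketch in Python/SQLite with hypothetical table and column names:

```python
import sqlite3

# Type-2 SCD sketch (illustrative only; names are hypothetical):
# when a tracked attribute changes, the current dimension row is
# closed out and a new current row is inserted.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city TEXT,
        valid_from TEXT,
        valid_to TEXT,
        is_current INTEGER
    )
""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Austin', '2015-01-01', '9999-12-31', 1)")

def scd2_update(conn, customer_id, new_city, change_date):
    """Expire the current row and insert a new current row (Type-2 SCD)."""
    row = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id = ? AND is_current = 1",
        (customer_id,),
    ).fetchone()
    if row and row[0] != new_city:
        # Close out the old version of the row
        conn.execute(
            "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
            "WHERE customer_id = ? AND is_current = 1",
            (change_date, customer_id),
        )
        # Insert the new current version, keeping the old one for history
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
            (customer_id, new_city, change_date),
        )

scd2_update(conn, 1, 'Dallas', '2016-06-01')
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer WHERE customer_id = 1 ORDER BY valid_from"
).fetchall()
print(rows)  # [('Austin', 0), ('Dallas', 1)]
```

A Type-1 dimension would instead overwrite the row in place, losing history; Type-2 trades extra rows for a full audit trail.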
TECHNICAL SKILLS
ETL Tools: Informatica
Data Modeling: Star schema, snowflake schema
Databases: Oracle 10g/11g, MySQL, Hadoop, Teradata
Oracle Utilities: Oracle SQL Developer, SQL
Languages & Scripting: SQL, Hive, Spark, Python, Unix shell
Reporting Tools: Omniture, Tableau
Operating Systems: Windows, Macintosh
PROFESSIONAL EXPERIENCE
Confidential, Sunnyvale, CA
Data Analyst
Responsibilities:
- Collaborated closely with product teams to deliver analyses to consumers and guide product enhancements
- Primarily responsible for mapping attributes between independent client systems using a proprietary ‘Attribute mapping’ tool
- Acquired data from various primary/secondary data sources and maintained databases/data systems for data analysis purposes
- Identified, analyzed, and interpreted patterns in complex data using statistical techniques
- Provided ongoing reports, developed and implemented data collection systems and other strategies that optimized statistical efficiency and data quality
- Worked with large volumes of data, tailored analyses and deep dives along with communication of findings with key decision-makers across the organization
- Created end-to-end mappings for various datasets provided by business partners
- Wrote SQL queries in Hive and Spark to answer business questions and for data validation
- Worked closely with the Analytics team in model tuning and assisting them with transaction-related data
- Involved in data validation, data ingestion and data migration
- Created tickets for the engineering team to whitelist specific catalogs in order to publish data to the website
- Validated multiple images for each product type based on various catalogs
- Actively involved in business partner meetings
- Built an Omniture dashboard to understand product assortment and purchase behavior in the baby cribs category.
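The SQL written in Hive and Spark for data validation typically reconciles a load against its source. A minimal sketch of that kind of query, shown here against SQLite with hypothetical table names (in practice it ran on Hive/Spark):

```python
import sqlite3

# Data-validation sketch: reconcile row counts between source and target
# and check for null business keys after a load. Table names are
# hypothetical; the same SQL shape applies in Hive or Spark SQL.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

# Completeness check: did every source row arrive?
src_count = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]

# Correctness check: no rows with a missing business key
null_keys = conn.execute(
    "SELECT COUNT(*) FROM tgt_orders WHERE order_id IS NULL"
).fetchone()[0]

load_ok = (src_count == tgt_count) and (null_keys == 0)
print(load_ok)  # True when the load is complete and keys are populated
```

Real validation suites add sum/hash comparisons on measures and spot checks on transformed columns, but count and null-key reconciliation is the usual first pass.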
Confidential
Data Analyst
Responsibilities:
- Helped the client reduce annual indirect spend through mining of multi-dimensional data
- Generated monthly reports at the Business Unit, Category and VP level to give a detailed break-up of spend, and prepared ad-hoc reports based on analysis of this spend
- Served as the single point of contact between the supplier, the client and the offshore team for RFQ and PO related concerns
- Worked with Perkins and Geneva partitions to handle data reconciliation requests
- Involved with data profiling for multiple sources and answered business questions by providing data to business users
- Maintained database for inventory, spend and invoice systems and created dynamic stored procedures to retrieve the details related to category wise spend and drop in inventory
- Used SQL tools like SQL Developer, TOAD to run SQL queries to validate data
- Developed and maintained SQL packages, procedures and functions to support the reports by retrieving data
- Created complex mappings in Power Center Designer using Expression, Filter, Sequence Generator, Update Strategy, Joiner and Stored procedure transformations
- Maintained warehouse metadata, naming standards and warehouse standards for future application development
- Involved in creating UNIX jobs and scripts to invoke the ETL workflows
- Used workflow manager to configure workflow and tasks and used workflow monitor to monitor process status
- Recorded the test scenarios and performed unit testing
- Coordinated with users to perform User Acceptance Testing.
- Built a spend categorization dashboard to help category managers understand and reduce uncategorized spend.
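The category-wise spend retrieval behind reports like these boils down to a grouped aggregation that also surfaces unmapped spend. A hedged sketch with hypothetical tables (the real logic lived in stored procedures):

```python
import sqlite3

# Spend-categorization sketch: total spend per category, with NULL
# categories rolled up as 'Uncategorized' so category managers can see
# what still needs mapping. Table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE invoices (invoice_id INTEGER, category TEXT, amount REAL);
    INSERT INTO invoices VALUES
        (1, 'IT Services', 500.0),
        (2, 'IT Services', 250.0),
        (3, NULL, 100.0),
        (4, 'Facilities', 300.0);
""")

spend_by_category = conn.execute("""
    SELECT COALESCE(category, 'Uncategorized') AS category,
           SUM(amount) AS total_spend
    FROM invoices
    GROUP BY COALESCE(category, 'Uncategorized')
    ORDER BY total_spend DESC
""").fetchall()
print(spend_by_category)
# [('IT Services', 750.0), ('Facilities', 300.0), ('Uncategorized', 100.0)]
```

In a stored procedure this same query would be parameterized by period and business unit; the `COALESCE` is what makes uncategorized spend visible rather than silently dropped.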
Confidential
Data Analyst
Responsibilities:
- Developed recurring and ad-hoc reports for client by accessing POS data and Panel data from multiple data sources from North America using commercially available OLAP tools
- Provided recommendations to help client understand category performance, performance across various dimensions (brands/markets/channels and the like)
- Performed data profiling for analysis of cross-database attributes and their relationships, along with business metadata and transactional data
- Interacted with business users and business analysts to understand requirements; analyzed business requirements and functional specifications
- Prepared technical specifications/mapping documents for the development of Informatica (ETL) mappings to load data into various target tables
- Worked on data sets with millions of rows, ensuring correctness of codes and identifiers
- Developed Type-1, Type-2 and Type-3 Slowly Changing Dimensions
- Prepared various mappings to load the data through different stages (Staging 1, Staging 2 and Target tables). Extensively used mapping parameters and mapping variables to make mappings more flexible and to minimize hard coding in ETL logic
- Involved in analyzing bugs, performance of SQL Queries and provided solutions to improve the same
- Designed the process flows to record dependencies between the mapping runs
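The staged, parameter-driven loads described above can be sketched as follows. This is an illustrative Python/SQLite analogue, not the Informatica implementation; the `load_date` parameter and table names are assumptions standing in for mapping parameters:

```python
import sqlite3

# Staged-load sketch: data moves Staging 1 -> Staging 2 -> Target, with a
# run parameter (load_date) passed in rather than hard-coded, mirroring
# Informatica mapping parameters/variables. All names are hypothetical.
params = {"load_date": "2016-06-01"}  # assumed per-run parameter

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging1 (id INTEGER, value REAL, load_date TEXT);
    CREATE TABLE staging2 (id INTEGER, value REAL, load_date TEXT);
    CREATE TABLE target   (id INTEGER, value REAL, load_date TEXT);
    INSERT INTO staging1 VALUES
        (1, 5.0,  '2016-06-01'),
        (2, NULL, '2016-06-01'),
        (3, 7.0,  '2016-05-31');
""")

# Staging 1 -> Staging 2: keep only the current run's rows, drop null measures
conn.execute(
    "INSERT INTO staging2 SELECT * FROM staging1 "
    "WHERE load_date = :load_date AND value IS NOT NULL",
    params,
)
# Staging 2 -> Target: final load
conn.execute("INSERT INTO target SELECT * FROM staging2")

loaded = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(loaded)  # 1 row survives the filters for this run
```

Parameterizing the run date is what lets the same mapping be rerun for any period without editing ETL logic, which is the point of mapping parameters.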
Confidential
Data Analyst
Responsibilities:
- Responsible for collection of data from all legacy systems and existing data stores
- Configured and installed Informatica Power Center 9.1.6 on an Oracle database on Windows
- Created repositories and folders as per the client requirement
- Developed complex mappings using multiple sources and targets in different databases, using Source Qualifier, Expression, Joiner, Update Strategy, Connected and Unconnected Lookup, Rank, Router, Filter, Aggregator and Sequence Generator transformations
- Imported data from Relational databases, Legacy Systems, XML files and Flat files to Oracle database using Informatica Power Connect
- Involved in unit and iterative testing to check whether data loaded into targets was accurate
- Responsible for weekly status updates showing the progress of the ETL process.
- Provided support for the applications after production deployment to take care of any post-deployment issues
- Applied partitioning at the session level for mappings that loaded data to the target using a target lookup to avoid duplicate records.
- Used Static, Dynamic and Persistent lookup caches in Lookup transformations to identify the incremental data relevant to the data flow in the ETL process.
- Performed performance tuning using session partitions, dynamic cache memory and index caches
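The lookup-against-target pattern used above to avoid duplicates can be sketched in plain Python/SQL. In Informatica this is a Lookup transformation with a dynamic cache; here the cache is modeled as an in-memory set of target keys, and all table names are hypothetical:

```python
import sqlite3

# Incremental-load sketch: before inserting, each staging row is checked
# against the keys already in the target so that reruns and intra-batch
# duplicates do not create duplicate records. The 'seen' set plays the
# role of a dynamic lookup cache, updated as rows are inserted.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging (order_id INTEGER, amount REAL);
    CREATE TABLE target (order_id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO staging VALUES (1, 10.0), (2, 20.0), (2, 20.0), (3, 30.0);
    INSERT INTO target VALUES (1, 10.0);
""")

# Prime the "cache" with keys already present in the target
seen = {row[0] for row in conn.execute("SELECT order_id FROM target")}

for order_id, amount in conn.execute("SELECT order_id, amount FROM staging").fetchall():
    if order_id not in seen:
        conn.execute("INSERT INTO target VALUES (?, ?)", (order_id, amount))
        seen.add(order_id)  # dynamic-cache behavior: catch later duplicates too

target_rows = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(target_rows)  # 3: row 1 skipped (already loaded), duplicate row 2 skipped
```

Updating the cache as rows are inserted (rather than only priming it once) is what distinguishes a dynamic cache from a static one: it catches duplicates arriving within the same batch.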