Lead Analyst Resume
Foster City, CA
SUMMARY
- An accomplished Data Architect/Data Engineer/Analyst with over 15 years of hands-on experience designing and executing solutions for complex business problems involving large-scale data warehousing, real-time analytics, data mining, data analysis, and reporting
- Experience architecting highly scalable, distributed systems across diverse data sources, as well as designing and optimizing large, multi-terabyte data warehouses
- Integrated Big Data technologies into the overall architecture for data collection, data transformation, and data analysis
- Experience managing, leading, and mentoring teams
TECHNICAL SKILLS
Databases: Oracle 6.0/7.3/8.0/8i/9i/10g/11g/12c, Greenplum (PostgreSQL), Vertica, MySQL
Languages: COBOL, Java, J2EE, SQL, Oracle PL/SQL, Python, Perl & Shell Scripts
DB Tools: DBArtisan, PL/SQL Developer, SQL Developer
Oracle Products: SQL*Plus, PL/SQL, Oracle 8i/9i/10g/11g, Oracle Forms 6i, Export, Import, Oracle SQL*Loader, JDBC, Oracle Warehouse Builder, Oracle ERP (AR, AP, FA, GL, OE, AOL), OLAP, Data Warehousing, MDX
Predictive Analytics Tools: Oracle Data Miner, Alpine Data Labs, MADlib, R, SAS
ETL Tools: Informatica PowerCenter 8.6/8.5/8.1/7.x/6.x/5.x
Other Tools: TOAD, MS Excel, MS PowerPoint
Reporting Tools: Business Objects, Tableau, OBIEE, Oracle Reports, QlikView
Data Modeling: ERwin, MS Visio 2000
Job Schedulers/Code Repositories: crontab, AutoSys, Mercurial, JIRA, GitHub
PROFESSIONAL EXPERIENCE
Confidential, Foster City, CA
Lead Analyst
Responsibilities:
- Collaborate with stakeholders across the organization to understand business requirements
- Working with tools such as Excel and Power Query for web scraping and automation
- Designing Tableau dashboards from data extracted from the web
- Scraping web pages using Power Query
- Applied ETL concepts and tools such as Airflow and AWS Glue
- Experience with the AWS data ecosystem (S3, Redshift, etc.)
- Worked on AWS using the Redshift database
- Wrote ETL scripts to load data into AWS (see the sketch after this list)
- Created dashboards in Tableau for data analysis, consuming data from AWS
- Designing BI reports and performing data analysis using Excel, Tableau, and databases
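A minimal sketch of the S3-staging-plus-Redshift-COPY load pattern referenced above; the bucket, table, IAM role, and connection details are hypothetical placeholders, not production values.

import csv

import boto3
import psycopg2

def load_extract_to_redshift(rows, header):
    # Stage the extract as a CSV in S3 (bucket and key names are illustrative).
    staging_file = "/tmp/extract.csv"
    with open(staging_file, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)
    boto3.client("s3").upload_file(staging_file, "example-etl-bucket", "staging/extract.csv")

    # Bulk-load from S3 into a Redshift staging table via COPY (credentials are placeholders).
    conn = psycopg2.connect(host="example-cluster.redshift.amazonaws.com", port=5439,
                            dbname="analytics", user="etl_user", password="...")
    with conn, conn.cursor() as cur:
        cur.execute("""
            COPY staging.web_extract
            FROM 's3://example-etl-bucket/staging/extract.csv'
            IAM_ROLE 'arn:aws:iam::000000000000:role/redshift-copy-role'
            CSV IGNOREHEADER 1;
        """)
    conn.close()

Staging files in S3 and loading them with COPY, rather than issuing row-by-row inserts, is the usual bulk-load path into Redshift and keeps the Tableau-facing tables refreshable in a single step.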
Confidential, San Jose, CA
Senior Expert
Responsibilities:
- Collaborate with stakeholders across the organization to understand and prioritize solutions.
- Designing and driving analysis of various ETL processes using Perl or Python and Vertica scripting for AutoSupport data ingestion
- Working with ETL jobs, performing data analysis, and writing ad hoc queries in Vertica and PostgreSQL
- Led data discovery to establish accurate business definitions for BI reports
- Created logical and physical diagrams for new ETL and data visualization requirements
- Used machine learning algorithms for analytics reporting, such as forecasting metrics
- Publishing reports on Tableau Server for group and individual users
- Worked on automating ETL processes and scheduling them with crontab (see the sketch after this list)
- Maintaining ETL pipelines and data lakes, production support, and the customer support escalation rotation
- Schema design and business analysis for several Tableau dashboards
- Migrating data from several databases and schemas into a new schema (data migration)
- Resolving production and customer support issues
- Wrote several SQL scripts for analyzing the data
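A minimal sketch of a cron-scheduled ad hoc summary query of the kind described above, run here against PostgreSQL; the crontab entry, connection details, and autosupport.ingest_log table are illustrative assumptions.

# Example crontab entry (illustrative): run nightly at 02:00 and append to a log
#   0 2 * * * /usr/bin/python3 /opt/etl/daily_ingest_summary.py >> /var/log/etl/daily.log 2>&1
import csv

import psycopg2

QUERY = """
    SELECT system_id,
           COUNT(*)         AS payload_count,
           MAX(received_at) AS last_seen
    FROM autosupport.ingest_log              -- hypothetical ingestion table
    WHERE received_at >= CURRENT_DATE - INTERVAL '1 day'
    GROUP BY system_id
    ORDER BY payload_count DESC;
"""

def main():
    # Connection details are placeholders, not real credentials.
    conn = psycopg2.connect(host="db.example.com", dbname="autosupport",
                            user="etl_user", password="...")
    with conn, conn.cursor() as cur:
        cur.execute(QUERY)
        rows = cur.fetchall()
    # Write a daily summary file that a downstream report or Tableau extract can pick up.
    with open("/tmp/daily_ingest_summary.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["system_id", "payload_count", "last_seen"])
        writer.writerows(rows)

if __name__ == "__main__":
    main()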
Confidential, Milpitas, CA
Senior Tech Lead
Responsibilities:
- Involved in business analysis and requirements gathering, writing technical documentation, and data modeling for Business Intelligence solutions
- Working with stakeholders across the organization to understand and prioritize tasks and projects
- Involved in credit data analysis and data mapping
- Wrote ETL scripts using PL/SQL, shell, and Perl
- Created tables, indexes, partitioned tables, materialized views, stored procedures, and packages
- Designed table layouts in Hadoop, wrote scripts to load data into Hadoop, and maintained the ETL pipelines
- Analyzed sales data for prediction using machine learning algorithms such as regression and support vector machines in Oracle Data Miner
- Created dashboards in Tableau on retail store data and credit card data
- Worked on analyzing, designing, and creating tables (e.g., for A/B testing) in Hadoop and Hive
- Designing and automating Tableau dashboards for the service marketing funnel and test optimization, e.g., A/B testing
- Created SQL scripts, stored procedures, functions, and packages in Oracle and PostgreSQL
- Publishing reports on Tableau Server for group and individual users
- Created logical and physical diagrams for new ETL and data visualization requirements
- Working with data analysis and writing ad hoc queries in Hadoop/Hive and HAWQ
- Worked on automating ETL processes in Hive and scheduling them with Oozie (see the sketch after this list)
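A minimal sketch of the kind of Hive ETL step described above, assuming hypothetical database and table names; in practice a step like this would run as an Oozie-scheduled workflow action rather than as a standalone script.

import subprocess
from datetime import date, timedelta

def load_ab_test_partition(run_date: date) -> None:
    # Aggregate raw events into a daily A/B-test summary partition (table names are illustrative).
    ds = run_date.strftime("%Y-%m-%d")
    hql = f"""
        INSERT OVERWRITE TABLE analytics.ab_test_daily PARTITION (ds='{ds}')
        SELECT experiment_id,
               variant,
               COUNT(DISTINCT visitor_id) AS visitors,
               SUM(converted)             AS conversions
        FROM raw.web_events
        WHERE ds = '{ds}'
        GROUP BY experiment_id, variant;
    """
    # Run the HiveQL through the Hive CLI; Oozie would invoke an equivalent Hive action on schedule.
    subprocess.run(["hive", "-e", hql], check=True)

if __name__ == "__main__":
    load_ab_test_partition(date.today() - timedelta(days=1))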
Confidential, Sunnyvale, CA
Senior ETL Lead/Architect
Responsibilities:
- Involved in business analysis and requirements gathering, writing technical documentation, and data modeling
- Wrote the data mapping document and created data flow diagrams
- Created data models for the Yidea, Purple Passion, and Forecasting applications
- Created SQL scripts, stored procedures, functions, and packages in Oracle 10g (see the sketch after this list)
- Working with users to gather report requirements and validate reports
- Involved in data analysis and data mapping for the marketing and forecasting applications
- Modified ETL scripts written in PL/SQL, shell, and Perl
- Created tables, indexes, partitioned tables, materialized views, stored procedures, and packages in the Oracle database
- Migrating SQL Server and Oracle databases and schemas from several sources into a new schema
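A minimal sketch of deploying and invoking an Oracle stored procedure from a script, along the lines of the Oracle 10g work above; the refresh_forecast_summary procedure, table names, and connection details are hypothetical.

import datetime

import cx_Oracle

CREATE_PROC = """
CREATE OR REPLACE PROCEDURE refresh_forecast_summary (p_run_date IN DATE) AS
BEGIN
    -- Rebuild the daily summary rows used by the forecasting reports (illustrative logic).
    DELETE FROM forecast_summary WHERE run_date = p_run_date;
    INSERT INTO forecast_summary (run_date, product_id, forecast_qty)
        SELECT p_run_date, product_id, SUM(forecast_qty)
        FROM forecast_detail
        WHERE run_date = p_run_date
        GROUP BY product_id;
    COMMIT;
END refresh_forecast_summary;
"""

def main():
    # Connection details are placeholders, not real credentials.
    conn = cx_Oracle.connect("etl_user", "...", "db.example.com/ORCL")
    cur = conn.cursor()
    cur.execute(CREATE_PROC)  # compile the procedure into the schema
    cur.callproc("refresh_forecast_summary", [datetime.date.today()])
    conn.close()

if __name__ == "__main__":
    main()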