Data Analyst Resume
Sunnyvale, CA
PROFESSIONAL SUMMARY:
- 8 years of experience in Data Analysis, Business Analysis, Data Profiling, Data Integration, Data Migration, Data Governance, Metadata Management, Master Data Management, and Configuration Management.
- Worked on data quality management platforms such as Informatica Intelligent Data Quality and Oracle Enterprise Data Quality.
- Good experience with Django, a high-level Python web framework, and with object-oriented programming (OOP) concepts using Python, Django, and Linux.
- Experienced in developing web-based applications using Python, Django, C++, XML, CSS, HTML, JavaScript, AngularJS, and jQuery.
- Experienced in developing web services in Python.
- Hands-on experience with the R language for statistical analysis and data reporting.
- Worked with SharePoint to streamline data management and access.
- Worked with SQL Server Management Studio (SSMS) to configure, manage, and administer all components within Microsoft SQL Server.
- Experience in various phases of the Software Development Life Cycle (analysis, requirements gathering, design, testing) with expertise in data validation, source-to-target mappings, SQL joins, and data cleansing.
- Expertise in data models, database design and development, data mining, and segmentation techniques.
- Strong experience in Data Analysis, Data Profiling, Data Cleansing & Quality, Data Migration, and Data Integration.
- Experienced data modeler with strong conceptual, logical, and physical data modeling skills, plus experience in data profiling, maintaining data quality, creating data mapping documents, and writing functional specifications and queries.
- Knowledge of statistics and experience using statistical packages for analyzing data sets (Excel, SAS, etc.).
- Experience in data modeling, working in the areas of data mining and data quality processes.
- Hands-on experience in using software packages such as SAS for analytical modeling and data management.
- Good experience with data archival processes to SAS data sets and flat files.
- Good knowledge of Base SAS, SAS/STAT, SAS/ACCESS, SAS/GRAPH, SAS/MACRO, SAS/ODS, and SAS/SQL in a Windows environment.
- Maintain ethical standards with data by ensuring database integrity as well as compliance with legislative, regulatory, and accrediting requirements.
- Experienced in performance tuning and optimization to increase the efficiency of scripts on large databases for fast data access, conversion, and delivery.
- Expert in backend testing using SQL to verify data integrity and data validation of client-server and web-based applications.
- Experience in creating views for reporting purposes involving complex SQL queries with sub-queries, inline views, multi-table joins, WITH clauses, and outer joins as per functional needs.
- Good experience in Teradata, Ab Initio, Business Objects, Crystal Reports, PL/SQL, SAS, MS Excel and MS Access.
- Working experience in agile delivery environments and all phases of the Software Development Life Cycle (SDLC).
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
TECHNICAL SKILLS:
ETL Tools: Informatica, SSIS, DataStage, Ab Initio
Data Modeling: Erwin, Power BI, Data Warehouse
Databases: Teradata, Oracle, MS SQL Server, MS Access, DB2
OLAP Tools: MicroStrategy OLAP Suite, Cognos, Business Objects
Languages: SQL, PL/SQL, Unix Shell Scripts, Python
Operating Systems: Windows, Unix, Sun Solaris, Linux
Testing Tools: HP ALM and Quality Center
Domain: Banking, Finance, Retail
PROFESSIONAL EXPERIENCE:
Confidential, Sunnyvale, CA
Data Analyst
Responsibilities:
- Develop and implement databases, data collection systems, data analytics, and other strategies that optimize statistical efficiency and quality.
- Built multifunction readmission reports using Python pandas and the Django framework (a minimal pandas sketch follows this list).
- Validated previously developed Python reports, fixed the identified bugs, and redeployed them.
- Developed appropriate ETL Data Quality routines and mappings using Informatica Data Quality based on the requirements and technical design specifications.
- Prepared scripts in the R programming language to ensure proper data access, manipulation, and reporting functions.
- Identified candidate business processes that could leverage SharePoint tools to enhance day-to-day office functions using workflows, InfoPath forms, and CorasWorks.
- Utilized SAS and SQL to extract data from statewide databases for analysis.
- Acquired data from primary and secondary data sources and maintained databases/data systems.
- Performed Data Analysis using visualization tools such as Tableau, Spotfire, and SharePoint to provide insights into the data.
- Configured Azure platform offerings for web applications and business intelligence using Power BI, Azure Data Factory, etc.
- Checked data flow with source-to-target mapping of the data.
- Created a data matrix for mapping the data to the business requirements.
- Performed data profiling to cleanse the data in the database and raised the data issues found.
- Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
- Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
- Assisted in mining data from the SQL database that was used in several significant presentations.
- Built SSIS packages to extract data from different sources such as SQL Server, MS Excel, and MS Access, then transform it and load it into dimension and fact tables in the data warehouse.
- Created DDL scripts for implementing data modeling changes; created Erwin crystal reports in HTML or RTF format depending on the requirement.
- Published the data model in the model mart; skilled in system analysis, E-R/dimensional data modeling, database design, and implementing RDBMS-specific features.
- Identify, analyze, and interpret trends or patterns in complex data sets.
- Manage and complete ad-hoc requests for data reports.
- Worked with data analysis tools (SAS Enterprise Miner, SAS Enterprise Guide, Tableau 9 and above) to analyze and visualize data to solve data analysis problems.
- Responsible for generating financial business reports using SAS Business Intelligence tools (SAS/BI) and developed ad-hoc reports using SAS Enterprise Guide.
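A minimal sketch of the pandas readmission reporting described in this role; the file name, column names, and the 30-day readmission window are illustrative assumptions, not the actual report logic.

    import pandas as pd

    # Hypothetical admissions extract; column names are assumed for illustration.
    admissions = pd.read_csv("admissions.csv", parse_dates=["admit_date", "discharge_date"])

    # Order each patient's stays chronologically so consecutive rows can be compared.
    admissions = admissions.sort_values(["patient_id", "admit_date"])

    # Days between this admission and the same patient's previous discharge.
    prev_discharge = admissions.groupby("patient_id")["discharge_date"].shift()
    gap_days = (admissions["admit_date"] - prev_discharge).dt.days

    # Flag stays that begin within 30 days of the prior discharge.
    admissions["readmission_30d"] = gap_days.le(30)

    # Aggregate into a report, e.g. readmission rate per facility.
    report = (admissions.groupby("facility")["readmission_30d"]
              .mean()
              .rename("readmission_rate")
              .reset_index())
    print(report)

In the Django report, a view would typically run an aggregation like this and pass the result to a template.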
Environment: Teradata, SAS, MultiLoad, Oracle, Unix Shell Scripts, SQL Server, Python, Erwin, MS Office Tools, MS Project, MS Access, Pivot Tables, Windows XP.
Confidential, San Jose, CA
Data Analyst
Responsibilities:
- Documenting data requirements and performing data analysis, with a clear understanding of the differences between business data and metadata.
- Wrote Python routines to log into websites and fetch data for selected options (a sketch follows this list).
- Developed views and templates with Python, using Django's view controller and template language to create the website interface.
- Worked with SAS as an ETL tool to extract data from Excel data sources and load it into Teradata tables.
- Led the data correction and validation process, using data utilities to fix mismatches between different shared business operating systems.
- Performed extensive data mining of the attributes involved in business tables, providing consolidated analysis reports and resolutions on an ongoing basis.
- Reviewed stored procedures for reports and wrote test queries against the source system (SQL Server) to match the results with the actual report against the data mart.
- Interpreted complex data mapping and data integration between two or more applications in a producer/consumer construct.
- Created DDL scripts for implementing data modeling changes; designed star and snowflake data models for the enterprise data warehouse using Erwin.
- Created and maintained the Logical Data Model (LDM) and Physical Data Model (PDM) for all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, and glossary terms.
- Validated data to check for proper conversion, performed data cleansing to identify and clean bad data, and profiled data for accuracy, completeness, and consistency.
- Involved in Data Reconciliation Process while testing loaded data with user reports.
- Developed and wrote SQL and similar code to perform data analysis using concepts such as joins/associations, and documented data mapping derivations and transformations.
- Performed root cause analysis of data discrepancies between different business systems by examining business rules and data models, and provided the analysis to the development/bug-fix team.
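A minimal sketch of the login-and-fetch routines described in this role, using the requests library; the URLs, form field names, and credentials are hypothetical placeholders.

    import requests

    LOGIN_URL = "https://example.com/login"      # placeholder URL
    DATA_URL = "https://example.com/export.csv"  # placeholder URL

    def fetch_report(username: str, password: str, out_path: str) -> None:
        """Log in, download a CSV export, and save it locally."""
        with requests.Session() as session:
            # The session keeps the auth cookie for the follow-up request.
            resp = session.post(LOGIN_URL, data={"username": username, "password": password})
            resp.raise_for_status()

            data = session.get(DATA_URL)
            data.raise_for_status()

            with open(out_path, "w", newline="", encoding="utf-8") as f:
                f.write(data.text)

    fetch_report("analyst", "secret", "report.csv")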
Environment: DataStage, DB2, Sybase, TOAD, SQL Server, TSYS Mainframe, Python, Erwin, SQL, PL/SQL, UNIX, Shell Scripting, XML, XSLT.
Confidential
Data Analyst
Responsibilities:
- Responsible for gathering data migration requirements.
- Identified problematic areas and conducted research to determine the best course of action to correct the data.
- Analyzed problems and resolved issues with current and planned systems as they relate to the integration and management of order data.
- Experienced in using Python collections to manipulate and loop through user-defined objects (a short illustration follows this list).
- Generated Django forms in Python to record data from online users and used pytest for writing test cases.
- Analyzed reports of data duplicates and other errors to maintain appropriate inter-departmental communication and produce monthly and daily data reports.
- Monitored select data elements for timely and accurate completion.
- Monitored data dictionary statistics.
- Involved in analyzing and adding new Oracle 10g features such as DBMS_SCHEDULER, CREATE DIRECTORY, Data Pump, and CONNECT BY ROOT to an existing Oracle 9i application.
- Archived old data by converting it into SAS data sets and flat files.
- Extensively used Erwin for forward and reverse engineering, following corporate naming-convention standards and using conformed dimensions whenever possible.
- Ensured a smooth transition from the legacy system to the newer one through the change management process.
- Trained team members in PL/SQL and provided Knowledge Transfer sessions on Finance and Banking domains.
- Planned project activities for the team based on project timelines using Work Breakdown Structure.
- Created Technical Design Documents, Unit Test Cases.
- Involved in test case/data preparation, execution, and verification of the test results.
- Created user guidance documentation.
- Created reconciliation report for validating migrated data.
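A small illustration of the Python collections usage mentioned in this role; the Order type and the grouping logic are invented for the example.

    from collections import Counter, defaultdict, namedtuple

    # A user-defined object standing in for an order record (illustrative only).
    Order = namedtuple("Order", ["order_id", "region", "amount"])

    orders = [
        Order(1, "West", 120.0),
        Order(2, "East", 75.5),
        Order(3, "West", 30.0),
    ]

    # defaultdict groups order amounts by region without key-existence checks.
    by_region = defaultdict(list)
    for order in orders:
        by_region[order.region].append(order.amount)

    # Counter tallies how many orders each region produced.
    volume = Counter(order.region for order in orders)

    for region, amounts in by_region.items():
        print(region, volume[region], sum(amounts))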
Environment: UNIX, Shell Scripting, XML Files, XSD, XML, SAS, PL/SQL, Oracle, Python, Teradata, Sybase, Erwin, Toad, Autosys.
Confidential
Data Analyst
Responsibilities:
- Designed and developed tools, techniques, metrics, and dashboards for insights and data visualization.
- Drove an understanding of and adherence to the principles of data quality management, including metadata, lineage, and business definitions.
- Used Python to place data into JSON files for testing Django websites (a sketch follows this list).
- Updated and manipulated content and files using Python scripts.
- Built and executed tools to monitor and report on data quality.
- Performed source-to-target data analysis and data mapping.
- Created SQL queries to validate data transformation and ETL loading.
- Defined the list codes and code conversions between the source systems and the data mart.
- Developed logical and physical data models that capture current state/future state data elements and data flows using Erwin.
- Involved in data mapping and data cleanup.
- Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
- Analyzed and maintained SAS programs and macros to generate SAS datasets.
- Coded PL/SQL packages to perform Application Security and batch job scheduling.
- Created reconciliation report for validating migrated data.
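A minimal sketch of placing test data into JSON files for a Django site, as described in this role; the fixture model label and fields are placeholders, not the project's actual schema.

    import json

    # Records shaped like a Django fixture (model label and fields are assumed).
    records = [
        {"model": "reports.customer", "pk": i,
         "fields": {"name": f"Customer {i}", "active": True}}
        for i in range(1, 4)
    ]

    with open("test_customers.json", "w", encoding="utf-8") as f:
        json.dump(records, f, indent=2)

The resulting file can then be loaded into a test database with Django's loaddata command.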
Environment: UNIX, Shell Scripting, XML Files, XSD, XML, SAS, Erwin, Oracle, Python, Teradata, Toad, Autosys, PL/SQL.