Senior Data Analyst Resume
Indianapolis, IN
SUMMARY
- 6 years of professional experience as a Data Analyst, working on different Data Warehouse projects
- Experienced in all phases of the Software Development Life Cycle (SDLC), on both Agile and Waterfall methodologies; acted as a key member of Agile/Scrum teams to communicate and progress projects
- Strong experience in Business and Data Analysis, Data Profiling and Data Integration
- Efficient in analyzing and documenting Business Requirement Documents (BRD) and Functional Specification Documents (FSD), along with Use Case Modeling and UML
- Extensive working experience in creating database objects like Tables, Indexes, Stored Procedures, Functions, Triggers, Sequences, Constraints and PL/SQL blocks
- Experienced in writing SQL statements and PL/SQL code for database objects such as tables, views, indexes, sequences, procedures, functions, packages, and triggers
- Extensive working experience with advanced PL/SQL and SQL concepts like Collections, Analytic Functions, Materialized Views and Partitioning
- Experience in understanding Stored Procedures, Stored Functions and Packages written in PL/SQL.
- Extensive working experience in translating Business Requirements into database objects
- Well versed with Data Migration, Data Conversions and Data Extraction/Transformation/Loading (ETL) using DTS and PL/SQL scripts
- Experience in designing and developing Data Warehouse applications using ETL and Business Intelligence tools like Informatica PowerCenter, DataStage, mainframe SAS, SSAS, SSIS and Business Objects in OLTP and OLAP environments.
- Experience with Data Extraction, Transformation and Loading (ETL) using various tools such as Data Transformation Services (DTS), SSIS and Bulk Copy Program (BCP).
- Experience with Relational Databases like Oracle 10g/9i/8i, DB2 ESE, MS SQL Server, DTS, Sybase, Teradata V2R5 and MS Access, and formats like flat files, CSV files, COBOL files and XML files
- Good command of data modeling using CASE tools like ERwin, Oracle Designer, PowerDesigner and ER/Studio.
- Created reports using SQL Server Reporting Services (SSRS) for customized and ad-hoc queries.
- Performed Data Extraction, Transformation and Loading (ETL) from source to target systems using Informatica PowerCenter 7.1
- Extensively worked on Forward Engineering and Reverse Engineering.
- Reverse engineered the database structure from the existing application using Sybase PowerDesigner.
- Experience in extracting data to create value-added datasets using SQL, R and Python to analyze trends, obtain insights, identify causes of deficient performance and suggest specific actions for improvement (a minimal sketch follows this summary).
- Data Analysis using SQL on Oracle, MS SQL Server, DB2 and Teradata.
- Gathered data, reporting and analysis requirements and translated them into reporting structures and data models, including aggregated tables, pivoted tables, and relational and dimensional (star-schema) marts.
- Coordinated with DBAs, QA and external teams on technical needs and to ensure timely project releases.
- Led discussions and JAD sessions with the Business Intelligence and developer teams to discuss workable solutions and document the Software Requirement Specifications (SRS).
- Excellent communication and interpersonal skills, with a proven ability to perform both as an individual and as a team player
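Illustrative sketch (referenced in the summary above): a minimal, hypothetical example of the SQL-plus-Python trend analysis described there. The SQLite connection, table name (sales_fact) and column names are placeholder assumptions, not artifacts of any project below.

import sqlite3
import pandas as pd

QUERY = """
SELECT order_date, revenue
FROM sales_fact
WHERE order_date >= DATE('now', '-12 months')
"""

def monthly_revenue_trend(conn):
    # Aggregate daily revenue to monthly totals plus month-over-month change.
    df = pd.read_sql(QUERY, conn, parse_dates=["order_date"])
    monthly = df.set_index("order_date").resample("MS")["revenue"].sum().to_frame()
    monthly["mom_change_pct"] = monthly["revenue"].pct_change() * 100
    return monthly

if __name__ == "__main__":
    with sqlite3.connect("warehouse.db") as conn:  # stand-in for Oracle/Teradata
        print(monthly_revenue_trend(conn))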
TECHNICAL SKILLS
Web Technologies: HTML5, CSS3, JavaScript, jQuery, Bootstrap, Visual Basic
Databases: Teradata SQL, MS SQL, MS Access, Oracle, SQLite.
SDLC Methodologies: Agile, Scrum, Waterfall
Operating Systems: Windows, Mac, UNIX
Application Tools: Eclipse (SE, EE, Android), Visual Studio, MySQL Workbench, Adobe Dreamweaver, MS Office, VirtualBox, XAMPP/WAMP Server, MS SQL Server
ETL Tools: DataStage, Informatica, SSIS, SSAS
Data Modeling Tools: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Conceptual/Logical/Physical Data Modeling, Erwin r7.2/3/4, Visio, Sybase PowerDesigner
Reporting: SAP Crystal Reports and SQL Server Reporting Services
Office Applications: MS Office (Word, Excel, Visio, PowerPoint, Project)
Scripting: Python, JavaScript
PROFESSIONAL EXPERIENCE
Confidential, Indianapolis, IN
Senior Data Analyst
Responsibilities:
- Developed logical and physical data models, created source-to-target mappings and schema crosswalks, and defined Informatica ETL to load data from source databases into the target data warehouse.
- Developed SQL stored procedures to load data into the Data Warehouse, creating a Business Intelligence environment for generating analytical reports from multiple disparate sources.
- Performed data integration checks, reported anomalies and communicated with the Data Architect for resolution.
- Created PL/SQL program units that execute DDL statements using Dynamic SQL.
- Performed data profiling and analysis, applied various data cleansing rules, designed data standards and architecture, and designed the relational models.
- Developed Python programs to read data from various Teradata tables, manipulate it and consolidate it into a single CSV file (see the sketch at the end of this section).
- Performed Data Analysis and Data Manipulation of source data from SQL Server and other data structures to support the business organization.
- Extensively involved in Data Governance, covering data definition, data quality, rule definition, privacy and regulatory policies, auditing and access control.
- Proven skill with Business Intelligence (Cognos) and data visualization tools (Excel).
- Used SQL to run ad-hoc queries on Oracle/SQL Server and prepared reports for management; wrote and validated T-SQL scripts and Stored Procedures for application development.
- Performed data analysis for testing, including supporting the ETL systems that interact with both the OLTP and Data Warehouse systems.
- Converted User Stories into Product Backlog Items (PBIs) and led PBI grooming sessions for prioritization and Sprint planning.
- Developed highly complex Stored Procedures, Packages, Functions, Cursors and Materialized Views in PL/SQL Developer and Toad in an Oracle 10g environment.
- Used Exception Handling extensively for ease of debugging and for displaying error messages in the application.
- Created database objects like tables, views, materialized views, procedures and packages using the Oracle tools TOAD, PL/SQL Developer and SQL*Plus.
- Extensively worked on Business Requirements Documents (BRD), Functional Requirement Documents (FRD), System Requirements Specifications (SRS), High-Level Design Documents (HLDD) and the traceability matrix, with assistance from the business group and IT team members.
- Deeply involved in the GAP Analysis of Employer Rosters; analyzed "AS IS" and "TO BE" scenarios and designed and documented new process flows, business processes and various business scenarios.
- Maintained metadata (data definitions of table structures) and version control for the data model.
- Coded and debugged Stored Procedures, Packages and Views in Oracle databases using SQL and PL/SQL, which were called by user-oriented application modules.
- Performed logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data within the Oracle database.
- Actively worked with the Scrum Master and Product Manager to monitor and prioritize the product backlog on an ongoing basis to meet release timelines and deliver value to the business.
- Worked towards continuous improvement of Agile and scrum practices.
- Designed and developed Use Cases using UML and Business Process Modeling. Applied Unified Modeling Language (UML) to design Use Case Diagrams, Activity Diagrams and Sequence Diagrams in MS Visio
- Wrote SQL queries to create database views and reports to track customer data.
- Executed SQL queries to retrieve data from different databases for data validation and analysis, researching underlying data errors and generating reports
- Participated in UAT and worked with HP Quality Center for bug/defect tracking.
- Facilitated User Acceptance Testing (UAT) with end users by providing necessary training and setting up testing data, test scenarios, and testing environment.
- Used Erwin for monthly summary data marts and inventory data marts.
- Worked with the Implementation team to ensure a smooth transition from the design to the implementation phase.
Environment: Struts Framework 1.1, JSP 1.2, Servlet 2.3, XML, Oracle 11g, Data Warehouse, OLAP, SQL Navigator, SQL Developer, Erwin 4.0, OLTP, MS Excel 2000, MS Office 2000, Microsoft Windows XP Professional.
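Illustrative sketch (referenced in the Python/Teradata bullet above): a minimal example of consolidating several Teradata tables into one CSV using the official teradatasql driver and pandas. The host, credentials and table names are hypothetical placeholders, not details from this engagement.

import pandas as pd
import teradatasql  # Teradata's official Python driver

TABLES = ["sales_2016", "sales_2017", "sales_2018"]  # hypothetical source tables

def export_tables_to_csv(host, user, password, out_path):
    # Read each table into a DataFrame, then write a single combined CSV.
    with teradatasql.connect(host=host, user=user, password=password) as conn:
        frames = [pd.read_sql(f"SELECT * FROM {t}", conn) for t in TABLES]
    pd.concat(frames, ignore_index=True).to_csv(out_path, index=False)

if __name__ == "__main__":
    export_tables_to_csv("tdhost.example.com", "analyst", "secret", "combined.csv")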
Confidential, Long Beach, CA
Sr. Data Analyst
Responsibilities:
- Processed large Excel source data feeds for Global Function Allocations and loaded the CSV files into the Oracle database with the SQL*Loader utility (see the sketch at the end of this section).
- Documented all related activities in Quality Center and coordinated with the QA team
- Developed best practices and standards for Data Governance processes.
- Performed data filtering and dissemination activities, troubleshot database activities, diagnosed bugs and logged them in the version control tool.
- Performed batch processing of data; designed the SQL scripts, control files and batch files for data loading.
- Performed data accuracy, data analysis and data quality checks before and after loading the data.
- Designed and developed the database objects (Tables, Materialized Views, Stored Procedures, Indexes).
- Performed the physical database design; normalized the tables and worked with denormalized tables to load the data into the fact tables of the data warehouse.
- Developed SQL joins, SQL queries, tuned SQL, views, test tables and scripts in the development environment.
- Used SQL*Loader to load data from external systems and developed PL/SQL programs to load the data from staging tables into base tables.
- Extensively wrote SQL queries (subqueries, correlated subqueries and join conditions) for data accuracy, data analysis and data extraction needs.
- Designed and developed the star schema data model with dimensional modeling; created Fact tables and Dimension tables to load the data into the Data Warehouse.
- Implemented one-to-many and many-to-many entity relationships in the data modeling of the data warehouse.
- Developed the E-R diagrams for the logical database model and created the physical data model with the Erwin data modeler.
- Edited the database SQL files with the vi editor and used UNIX facilities such as cron job scheduling, file execution, background processes and grep.
- Wrote Perl scripts to extract the data from the Oracle database, check database connections, apply regular expressions, etc.
- Developed the VBA integration between Excel feeds and the SQL database, extracting and transforming Excel data into the SQL database.
- Involved in writing Oracle PL/SQL procedures, functions and Korn shell scripts that were used for staging, transformation and loading of the data into base tables.
- Involved in data loading and data migration: used SQL*Loader to load data from Excel files into staging tables and data cubes, and developed PL/SQL procedures to load data from staging tables into the base tables of the data warehouse.
- Used TOAD and SQL Developer to develop programs for executing the queries.
- Worked on SQL query performance issues, using index logic to obtain reliable performance.
- Involved in the monitoring of database performance in Enterprise Manager Console.
- Created the XML control files to upload the data into the data warehousing system.
Environment: Oracle SQL Developer, MS Access, WinSQL, UltraEdit, Quality Center 8.2, Informatica PowerCenter 8.1/7.1/4.7, Informix, Teradata V2R6, Base SAS, VBA, ER/Studio Data Architect 9.0, SSIS, Excel, Business Objects, Crystal Reports, Sybase PowerDesigner
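Illustrative sketch (referenced in the first bullet above): a minimal example of loading a CSV feed into Oracle with SQL*Loader, generating the control file and invoking the sqlldr utility from Python. The staging table, column list and credentials are hypothetical placeholders.

import subprocess
from pathlib import Path

CONTROL_TEMPLATE = """\
LOAD DATA
INFILE '{csv_path}'
APPEND INTO TABLE gfa_staging
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(cost_center, alloc_period, alloc_amount)
"""

def load_csv(csv_path, userid="user/password@ORCL"):
    # Write a control file next to the CSV, then run sqlldr against it.
    ctl = Path(csv_path).with_suffix(".ctl")
    ctl.write_text(CONTROL_TEMPLATE.format(csv_path=csv_path))
    # sqlldr exits nonzero on errors (and warnings), so check=True surfaces them.
    subprocess.run(["sqlldr", f"userid={userid}", f"control={ctl}", "log=load.log"],
                   check=True)

if __name__ == "__main__":
    load_csv("gfa_feed.csv")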
Confidential, Denver, CO
Data Analyst
Responsibilities:
- Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center 9.0
- Involved in developing detailed test plan, test cases and test scripts using Quality Center for Functional and Regression Testing.
- Translated business requirements into conceptual and logical data models and integration data models; modeled databases for integration applications in a highly available, performant configuration using ER/Studio.
- Created the Physical Data Model from the Logical Data Model using the Compare and Merge utility in ER/Studio and worked with the naming standards utility.
- Performed GAP analysis with the Teradata MDM and Drive (SQL) data models to get a clear understanding of requirements.
- Used Base SAS, SAS/STAT, SQL and Macro facility to manipulate demographic, credit history and lifestyle datasets.
- Involved in Teradata SQL Development, Unit Testing and Performance Tuning
- Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables
- Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing)
- Tested several stored procedures.
- Validated several Business Objects reports. Reviewed and tested business requests for data and data usage
- Tested the ETL process both before and after the data validation process.
- Tested the messages published by ETL tool and data loaded into various databases
- Responsible for data mapping testing, writing complex SQL queries using WinSQL (see the sketch at the end of this section)
- The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
- Created UNIX scripts for file transfer and file manipulation.
- Prepared and distributed the Escalation Report and Daily Metrics through SAP Business Objects Web Intelligence for the manager, for business improvement purposes.
- Developed ETL procedures to ensure conformity, compliance with standards and lack of redundancy, translated business rules and functionality requirements into ETL procedures using Informatica Power Center.
- Performed Unit Testing and tuned the Informatica mappings for better performance.
- Re-engineered existing Informatica ETL processes to improve performance and maintainability.
- Involved in developing many dashboards for home, rental and auto insurance.
- Worked with Oracle SQL Developer to develop the SQL scripts.
- Developed PL/SQL programming that included writing Views, Stored Procedures, Packages, Functions and Database Triggers.
- Used the Sybase PowerDesigner tool for relational database and dimensional data warehouse designs.
- Developed Web applications using Visual Studio .NET, C# and InterSoft Web Grid
- Worked with ETL teams and used Informatica Designer, Workflow Manager and Repository Manager to create source and target definition, design mappings, create repositories.
- Automated periodic reports for the management team using Excel VBA, SQL Server, Hyperion Essbase and SSIS packages.
- Developed and maintained a data dictionary to create metadata reports for technical and business purposes.
Environment: Model N Application, Informatica, Oracle 11g, DB2, UNIX, Toad, PuTTY, HP Quality Center, SSIS, SSAS, SSRS, ETL, Tableau, Sybase PowerDesigner
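Illustrative sketch (referenced in the data mapping testing bullet above): a minimal source-to-target reconciliation harness of the kind used to test ETL mappings, comparing row counts and column sums between a source table and its mapped target. The connections, schemas and column names are hypothetical placeholders.

import pandas as pd

CHECKS = [
    # (source SQL, target SQL) pairs that should return identical values
    ("SELECT COUNT(*) FROM src.claims",
     "SELECT COUNT(*) FROM dw.fact_claims"),
    ("SELECT SUM(claim_amount) FROM src.claims",
     "SELECT SUM(claim_amount) FROM dw.fact_claims"),
]

def run_mapping_checks(src_conn, tgt_conn):
    # Returns True only when every source/target pair reconciles.
    ok = True
    for src_sql, tgt_sql in CHECKS:
        src_val = pd.read_sql(src_sql, src_conn).iloc[0, 0]
        tgt_val = pd.read_sql(tgt_sql, tgt_conn).iloc[0, 0]
        if src_val != tgt_val:
            ok = False
            print(f"MISMATCH: {src_sql} -> source={src_val}, target={tgt_val}")
    return ok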
Confidential
Data Analyst
Responsibilities:
- Identified and prioritized strategic themes of the business to align them with IT strategies thereby growing revenue, improving profits through efficient quality of services.
- Created implementation checklist, using MS Excel for turnover to Application Continuity and Production Support.
- Validated and informed the technical design through data analysis across multiple layers of data warehouses, source systems and B2B channels
- Identified business process and documented new flows using MS Visio
- Responsible for data mapping and data mediation between the source data tables and target data tables using MS Access and MS Excel.
- Designed and created the logical database schema for the data warehouse (Teradata) environment to prepare for ETL.
- Wrote complex SQL queries to identify granularity issues and relationships between datasets and created recommended solutions based on analysis of the query results
- Identified, suggested and managed process improvements to meet data quality criteria.
- Used Erwin Data Modeler to create, enhance and maintain logical and physical models.
- Developed entities and relationships using Sybase PowerDesigner.
- Performed data extraction, data analysis and data manipulation, and prepared various production and ad hoc reports to support cost optimization initiatives and strategies using R, Excel, Python and Tableau, applying Machine Learning and Predictive Modeling.
- Used Spark SQL to perform transformations and actions on data residing in Hive; designed and developed the Hive tables required for Tableau (see the sketch at the end of this section)
- Developed Tableau visualizations and dashboards using Tableau Desktop.
- Developed Tableau workbooks from multiple data sources using Data Blending
- Developed guidance and standards for the creation of metadata
- Updated the CDM Metadata Tool (Data Dictionary) for the changes coming up with the new requirements.
- Worked with SMEs and conducted JAD sessions; documented the requirements using UML and use case diagrams
- Coordinated with data modelers on requirements for ER diagrams, including data type lengths and integrity constraints.
- Created and validated the test data environment for the Staging area, loading the Staging area with data from multiple sources.
- Created ETL test data for all transformation rules, covering all the scenarios required for implementing the business logic.
- Wrote several complex SQL queries for validating Cognos reports.
- Prepared test data in XML files and checked whether the data was parsed and loaded into staging tables.
- Tested data marts and data warehouse/ETL applications developed in Informatica and Ab Initio using Oracle, DB2, Teradata and UNIX.
- Loaded data from various sources like Teradata, Oracle, and fixed-width and delimited flat files.
- Tested the ETL process both before and after the data cleansing process.
- Created UNIX scripts for file manipulation and profiling.
- Updated metadata repository with the changes in the CCDW.
- Created macros to calculate YTM, cash flow, discount and principal outstanding schedules using Excel/VBA.
- Tested the database to check field size validation, check constraints and stored procedures, cross-verifying the field sizes defined within the application against the metadata.
- Used ADO to retrieve data from the MS Access database and display it in Excel.
- Developed and supported financial applications for Fixed Income, Derivatives, Risk Management, Futures, Options and Interest Rate Swaps with Excel/VBA and MS Access
Environment: Oracle 11g, PL/SQL, SQL*Loader, SQL Reports, Teradata, DB2, Cognos 8 BI, MS Visio, MS Excel, EDM Teams, EDM Quality, Oracle 12c/11g, TOAD, SQL Developer, TortoiseSVN, Erwin Data Modeler, AllLink Transact Load, Visual SourceSafe (VSS)
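Illustrative sketch (referenced in the Spark SQL bullet above): a minimal PySpark example that transforms Hive-resident data with Spark SQL and persists an aggregate table for Tableau. The database, table and column names are hypothetical placeholders.

from pyspark.sql import SparkSession

# Hive support lets Spark SQL read and write tables in the Hive metastore.
spark = (SparkSession.builder
         .appName("hive-to-tableau")
         .enableHiveSupport()
         .getOrCreate())

# Transformation over a Hive table, expressed in Spark SQL.
policy_summary = spark.sql("""
    SELECT policy_type,
           region,
           COUNT(*)            AS policy_count,
           SUM(premium_amount) AS total_premium
    FROM insurance.policies
    GROUP BY policy_type, region
""")

# Persist the aggregate as a Hive table that Tableau can connect to.
policy_summary.write.mode("overwrite").saveAsTable("reporting.policy_summary")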
Confidential
Intern-Data Analyst
Responsibilities:
- Created and Implemented Change controls on Production and Archive Databases.
- Actively delegated new implementations to System DBAs (offshore/onsite) after proof of concept (POC).
- Performed logical and physical database modeling using CA Erwin 4.1.
- Created Business Requirement Document (BRD) and Functional Requirement Document (FRD)
- Created mapping documents, ETL technical specifications and various documents related to data migration
- Developed PL/SQL programs and stored procedures for data loading and data validations (see the sketch at the end of this section).
- Involved in Testing Argus safety system 5.0/5.1.
- Developed Oracle and Teradata queries to replace current data warehouse reports
- Involved in testing Cognos Custom reports and Out of the box reports using Argus Insight 5.0/5.1.
- Performed GAP analysis of as-is and to-be systems/processes.
- Designed and developed ETL Test Case for the entire application.
- Executed the ETL Test Cases manually with the help of SQL, SAS and UNIX Scripts.
- Validated Cognos cubes by using SQL scripts in test and prod environment
- Created, rebuilt and maintained indexes.
- Worked on a large database processing millions of records, handling change requests and source control.
- Developed PL/SQL code to implement business rules using procedures and functions
- Worked on export and import of data using SQL*Loader and the UTL_FILE package.
- Extracted data from flat files and transformed it in accordance with the business logic specified by the client using AllLink Transact Load (a front-end application used for ETL)
- Involved in Unit Testing the code before Deployment.
- Handled CRs and implemented them according to the business rules required by the client.
- Involved in UAT testing and bug fixing before the code was sent to Production.
- Involved in developing test plans and test cases based on high-level and detailed designs.
- Involved in Production Support and troubleshooting data quality and integrity issues.
- Coordinated with senior DBAs on code reviews and changes in the code for the application.
- Used SQL*Loader to move data from flat files into an Oracle database.
- Coordinated with the offshore team in the testing process of the SDA (Signal Detection and Alerts) & CBS (Corporate Business Support) Cognos custom reports.
Environment: MS Excel, MS Word, Agile, MS Visio, SQL Server, Oracle 10g, DB2, SAP BODS.
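Illustrative sketch (referenced in the data validations bullet above): a minimal SQL-driven test harness of the kind used for the manually executed ETL test cases, where each data quality query should return zero rows. It works with any DB-API connection; the table and column names are hypothetical placeholders.

TEST_CASES = {
    "no null case ids":
        "SELECT case_id FROM safety_cases WHERE case_id IS NULL",
    "no duplicate case ids":
        "SELECT case_id FROM safety_cases GROUP BY case_id HAVING COUNT(*) > 1",
    "every case maps to a valid product":
        """SELECT c.case_id FROM safety_cases c
           LEFT JOIN products p ON p.product_id = c.product_id
           WHERE p.product_id IS NULL""",
}

def run_etl_tests(conn):
    # A non-empty result set means the check found offending rows.
    cur = conn.cursor()
    for name, sql in TEST_CASES.items():
        cur.execute(sql)
        rows = cur.fetchall()
        print(f"{name}: " + ("PASS" if not rows else f"FAIL ({len(rows)} bad rows)"))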
Confidential, Oaks,PA
Sr. Data Analyst
Responsibilities:
- Gathered requirements, performed Business Analysis and documented all the data after pulling it from databases.
- Analyzed the data and wrote custom MySQL queries for better performance, joining the tables and selecting the required data to run the reports.
- Utilized technology such as MySQL and Excel PowerPivot to query test data and customize end-user requests.
- Developed SQL scripts and wrote triggers and cursors.
- Gained expertise in deduplication, removing duplicates in source data using various data quality transforms available in SAP BusinessObjects Data Services (see the sketch at the end of this section).
- Scheduled the sessions to extract, transform and load data into warehouse database as per Business requirements. Improved the session performance by pipeline partitioning.
- Extensively used ETL and SAP BODS to load data from Oracle, flat files into the target SAP system.
- Re-engineered existing Informatica ETL processes to improve performance and maintainability.
- Developed, executed and tested BODS jobs per business requirements.
- Used SQL query performance tuning to identify problem tables and understand the performance of the database tables.
- Mapped business requirements to the Business Data Model and worked with systems analysts on Canonical Mapping.
- Used SQL Server Reporting Services to handle reporting of the designed hierarchy dimensions.
- Developed Source to Target Data Mapping, Data Profiling, Transformation and Cleansing rules using SAP BODS.
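Illustrative sketch (referenced in the deduplication bullet above): the project used SAP BusinessObjects Data Services transforms for deduplication; this is a minimal pandas rendering of the same keep-the-latest-record idea. The column names (customer_id, updated_at) are hypothetical placeholders.

import pandas as pd

def dedupe_keep_latest(df):
    # One row per customer_id, preferring the most recently updated record.
    return (df.sort_values("updated_at")
              .drop_duplicates(subset="customer_id", keep="last")
              .reset_index(drop=True))

Usage: dedupe_keep_latest(pd.read_csv("customers.csv")) returns the source feed with exact and stale duplicates collapsed to the newest record per customer.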