
Sr. Data Analyst/Data Modeler Resume


Minneapolis, MN

PROFESSIONAL SUMMARY:

  • Over 8 years of experience in data modeling, business and data analysis, production support, database management, strategic analysis, requirements gathering, and application and decision support.
  • Experienced with all major databases (Oracle, SQL Server, Teradata) in large data warehouse (OLAP) environments.
  • Strong background in data modeling tools, particularly Erwin.
  • Extensive experience in SDLC (Software Development Life Cycle) and good experience in the field of business analysis, reviewing, analyzing, and evaluating business systems and user needs, business modeling, document processing.
  • Efficient in developing logical and physical data models and organizing data as per business requirements using Sybase PowerDesigner and Erwin in both OLTP and OLAP applications.
  • Comprehensive knowledge and experience in process improvement and normalization/de-normalization.
  • Extensively worked on various RDBMS like Teradata, Oracle, DB2, Netezza, and SQL Server.
  • Knowledge and experience in Microsoft Office tools like MS Access, MS SharePoint, MS Word, MS PowerPoint, and MS Excel.
  • Sound practical understanding of data modeling (dimensional and relational) concepts such as star-schema modeling, snowflake-schema modeling, and fact and dimension tables.
  • Expertise in documenting the Business Requirements Document (BRD), generating UAT test plans, maintaining the Traceability Matrix, and assisting in post-implementation activities.
  • Experience working on distributed storage for analysis and processing of large data sets using Apache Hadoop.
  • Excellent knowledge of Perl and UNIX.
  • Experienced working with Excel Pivot and VBA macros for various business scenarios.
  • Strong experience in data analysis, data migration, data cleansing, transformation, integration, data import, and data export using multiple ETL tools such as Ab Initio and Informatica PowerCenter. Experienced in testing and writing SQL and PL/SQL statements, including stored procedures, functions, triggers, and packages.
  • Excellent knowledge of creating reports in SAP BusinessObjects, Tableau, and TIBCO Spotfire.
  • Extensive knowledge and experience in producing tables, reports, graphs and listings using various procedures and handling large databases to perform complex data manipulations.
  • Excellent experience in data mining, querying and mining large datasets to discover transition patterns and examine complex data.
  • Solid experience in Database performance tuning, optimization and maintenance.
  • 5 years of extensive experience with Informatica Data Quality (IDQ) and Informatica PowerCenter, with strong business understanding and knowledge of extracting, transforming, and loading data from heterogeneous source systems like Oracle, SQL Server, flat files, Excel, XML, UDB, and Sybase.
  • Excellent experience designing both logical and physical data models in large enterprise organizations.
  • Experience using Hive and working with HiveQL.
  • In-depth understanding of data modeling concepts and techniques as applied in OLTP, ODS, and data warehouse applications, including normalization (3NF), de-normalization, dimensional modeling, and OLAP.
  • Experience working within a broader team to develop a conceptual design into a detailed logical data model (ERD) and a corresponding physical data model through DDL scripts, and overseeing the execution and testing of those DDL scripts in collaboration with DBAs (a brief DDL sketch follows this summary).
  • Excellent knowledge of creating DML statements against the underlying tables.
  • Extensive ETL testing experience using Informatica 8.6.1/8.1 (PowerCenter/PowerMart: Designer, Workflow Manager, Workflow Monitor, and Server Manager).
  • Participated in impact analysis of the Informatica MDM 9.10 upgrade on the existing configuration.
  • Good exposure to working in an offshore/onsite model, with the ability to understand and/or create functional requirements while working with clients; good experience in requirement analysis and in generating test artifacts from requirements documents.
  • Excellent at creating various project artifacts, including specification documents, data mapping documents, and data analysis documents.
  • An excellent team player and technically strong person with the ability to work with business users, project managers, team leads, architects, and peers, thus maintaining a healthy project environment.
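
As referenced above, a minimal sketch of the kind of DDL script developed from a logical data model; the star-schema tables and columns are hypothetical, not drawn from any actual engagement:

    -- Hypothetical dimension and fact tables generated from a logical model
    CREATE TABLE dim_customer (
        customer_key   NUMBER         PRIMARY KEY,   -- surrogate key
        customer_id    VARCHAR2(20)   NOT NULL,      -- natural key from the source system
        customer_name  VARCHAR2(100),
        effective_dt   DATE           NOT NULL       -- slowly-changing-dimension tracking
    );

    CREATE TABLE fact_sales (
        sales_key      NUMBER         PRIMARY KEY,
        customer_key   NUMBER         NOT NULL
                       REFERENCES dim_customer (customer_key),  -- enforced referential integrity
        sales_amount   NUMBER(12,2),
        sales_dt       DATE
    );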

TECHNICAL SKILLS:

Data Warehousing: Informatica 9.6/9.5/9.1/8.6/7.1.2 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Pentaho (BI), SSIS, DataStage 8.x

Reporting Tools: Business Objects 6.5/XI R2

Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables, Erwin

Testing Tools: WinRunner, LoadRunner, TestDirector, Mercury Quality Center, Rational ClearQuest

RDBMS: Oracle 12c/11g/10g/9i/8i/7.x, MS SQL Server, UDB DB2 9.x, Teradata V2R6/R12/R13/R14, MS Access 7.0

Programming: SQL, PL/SQL, UNIX Shell Scripting, VB Script

Environment: Windows (95, 98, 2000, NT, XP), UNIX

Other Tools: TOAD, MS-Office suite (Word, Excel, Project and Outlook), BTEQ, Teradata V2R6/R12/R13 SQL Assistant

PROFESSIONAL EXPERIENCE:

Confidential, Minneapolis, MN

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Worked with Business Analysts in gathering reporting requirements, representing the Data Design team.
  • Created semantic data models for pharmaceutical claims data and worked on XML to HL7 parsing.
  • Designed Teradata data warehouse tables as per the reporting requirements.
  • Performed data management projects and fulfilled ad-hoc requests.
  • Worked at the conceptual/logical/physical data model level using Erwin according to requirements.
  • Worked on the non-relational database Apache HBase.
  • Configured HDFS to run Hadoop, Hive, Pig, and HBase.
  • Used a reverse engineering approach to redefine entities, relationships, and attributes in the data model as per new specifications in Erwin after analyzing the database systems currently in use.
  • Worked with the Data Quality team in defining and configuring rules, monitoring, and preparing data quality analysis and dashboards using Trillium.
  • Designed Oracle-based source-to-target mappings using the Informatica Developer tool for data cleansing, validation, integration, and matching using Informatica Data Quality (IDQ).
  • Extracted data from Oracle using SAS.
  • Used MDM in understanding business needs and problems and converting them into business requirements.
  • Developed SQL stored procedures and Autosys scheduler jobs to load data into the database.
  • Worked with Embarcadero ER/Studio Data Architect (a data modeling tool) before deployment into the physical models.
  • Extensively worked on SQL in analyzing and querying the database as per the business scenarios.
  • Strong functional knowledge of MDM architecture, Data Manager, the Materials & Services data model, tables, and MDM Console.
  • Integrated Data Quality routines in the Informatica mappings to standardize and cleanse data.
  • Extensively worked on Informatica Data Explorer in creating data profiles with comparative profiling analysis and verifying the accuracy of mapping logic.
  • Worked with DBAs to create the physical model and tables. Worked on the Informatica Data Quality tool for data cleansing, conformance, and freshness of data.
  • Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design.
  • Used Hive and Pig for running queries and managing large data sets residing in distributed storage (Hadoop); an illustrative HiveQL query follows this list.
  • Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture involving OLTP, ODS, and OLAP.
  • Developed and maintained a data dictionary to create metadata reports for technical and business purposes.
  • Scheduled multiple brainstorming sessions with DBAs and the production support team to discuss views, partitioning, and indexing schemes, case by case, for the facts and dimensions.
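
As an illustration of the Hive usage referenced above, a minimal HiveQL sketch; the table, columns, and HDFS path are assumptions for demonstration only:

    -- Hypothetical external table over pipe-delimited claims extracts in HDFS
    CREATE EXTERNAL TABLE claims_raw (
        claim_id     STRING,
        member_id    STRING,
        claim_amount DOUBLE,
        service_dt   STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
    LOCATION '/data/claims/raw';

    -- Aggregate query; Hive compiles this into MapReduce jobs
    SELECT member_id,
           COUNT(*)          AS claim_cnt,
           SUM(claim_amount) AS total_paid
    FROM   claims_raw
    GROUP  BY member_id
    HAVING SUM(claim_amount) > 10000;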

Environment: ER Studio 8.0.3, Teradata R14, Microsoft SQL Management Studio, HP Quality Center, Unix

Confidential, New York, NY

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Used modeling techniques and tools in analyzing and specifying data structure in large and/or complex data sets.
  • Recommended best practices for data transformations to support the data movement. Defined data migration requirements by gathering, validating, and documenting data requirements.
  • Translated, merged, cleaned up and/or mapped data into conceptual, logical and physical data models.
  • Developed Sqoop scripts to enable interaction between Pig and the MySQL database.
  • Wrote stored procedures in T-SQL (Sybase ASE) after BRD analysis, using Embarcadero Rapid SQL.
  • Good skills in debugging complex Informatica IDQ mapping designs and any production issues.
  • Created the semantic data model, taking into consideration any denormalization, materialization, etc. required, to ensure optimum performance and meet the reporting SLAs defined by the business.
  • Involved in activities related to data cleansing, data quality and data consolidation using industry standards and processes.
  • Identified, developed, and documented data standardization, data enrichment operations, data validations, data security requirements, and data exception handling processes.
  • Performed IDQ profiling on the standardized data and verified de-duplication and match results.
  • Provided insight to the IT team for setting up the MDM workflow.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Documented, implemented, and maintained models to improve processing, distribution, data flow, collection, and database editing procedures.
  • Assisted Business Subject Matter Experts and data stewards to review and “scrub” their data in a timely manner and perform data audits, as necessary.
  • Assisted in creation of document data flow diagrams, entity relationship diagrams and other project deliverables, as requested.
  • Assisted project team with trouble-shooting of “data issues” as they arise during testing cycles.
  • Conducted data modeling sessions with business users and technical staff, which includes conveying complex ideas both verbally and in written form.
  • Defined parameters for file and space utilization & wrote queries related to data retrieval.
  • Worked with the business team on reporting and requirements gathering and created data mapping documents for various transformation rules.
  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment; developed working documents to support findings and assign specific tasks.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Wrote complex SQL queries on Netezza and used them in Lookup SQL overrides and Source Qualifier overrides.
  • Extracted data from various sources like Oracle, Netezza, and flat files and loaded it into the target Netezza database.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Performed data reconciliation between integrated systems.
  • Performed metrics reporting, data mining, and trend analysis in a helpdesk environment using Access.
  • Involved in data mining, transformation, and loading from the source systems to the target system.
  • Supported business areas and database platforms to ensure logical data model and database design, creation, and generation follow enterprise standards, processes, and procedures.
  • Generated a variety of metadata artifacts
  • Provided input into database systems optimization for performance efficiency and worked on full lifecycle of data modeling (logical - physical - deployment)
  • Created numerous Teradata Volatile, Global Temporary, Set, and Multiset tables (see the sketch after this list).
  • Used the automated process for uploading data into production tables using UNIX.
  • Performed dicing and slicing on data using Pivot tables to identify the churn rate pattern and prepared reports as required.
  • Prepared in-depth data analysis reports weekly, biweekly, and monthly using MS Excel, SQL, and UNIX.
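
A minimal sketch of the kind of Teradata volatile table mentioned above; the table, columns, and churn-count logic are hypothetical:

    -- Session-scoped volatile table for intermediate churn analysis
    CREATE VOLATILE TABLE vt_churn_counts (
        account_id  INTEGER,
        month_id    INTEGER,
        event_cnt   INTEGER
    )
    PRIMARY INDEX (account_id)
    ON COMMIT PRESERVE ROWS;    -- keep rows across transactions within the session

    INSERT INTO vt_churn_counts
    SELECT account_id,
           EXTRACT(MONTH FROM event_dt),
           COUNT(*)
    FROM   account_events
    GROUP  BY 1, 2;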

Environment: PL/SQL, Informatica 9.5, Oracle 11g, Teradata V2R12/R13.10, Netezza, Teradata SQL Assistant 12.0, Erwin Data Modeler, ODS

Confidential, Portland, ME

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment; developed working documents to support findings and assign specific tasks.
  • Designed data migration scripts to move Sales Data warehouse to company Data warehouses.
  • Heavily used sales and consumer-interaction data for analysis and for creating business reports.
  • Extensively involved in Oracle PL/SQL programming to create triggers (an illustrative trigger sketch follows this list).
  • Worked with the Business Subject Matter Experts to understand and define data management policies including enterprise naming and definition standards, specification of business rules, data quality analysis, standardized calculations and summarization definitions, and retention criteria.
  • Performed logical data modeling, physical data modeling (including reverse engineering) using the Erwin Data Modeling tool.
  • Involved with the design and development of conceptual, logical, and physical data models using CA AllFusion Data Modeler (Erwin).
  • Performed logical data model design, including normalization/denormalization, referential integrity, data domains, primary and foreign key assignments, and data element definitions as applied to both relational and dimensional modeling.
  • Demonstrated experience in the gathering, assimilation, organization, categorization and analysis of business requirements to create corresponding designs of logical and physical data models
  • Created data masking mappings to mask the sensitive data between production and test environment.
  • Strong skills include Oracle 10g/11g SQL and PL/SQL.
  • Extensive customization of user exits in Informatica MDM per the specific business needs of the client.
  • Set up Hive with MySQL as a remote metastore.
  • Worked on claims, and provider related data and extracted data from various sources such as flat files, Oracle and Mainframes.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Worked on IDQ for Data profiling, cleansing and matching.
  • Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 8.6.1.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle.
  • Wrote several shell scripts using UNIX Korn shell for file transfers, error logging, data archiving, checking log files, and cleanup processes.
  • Created dashboards in TIBCO Spotfire and modified existing dashboards and reports.
  • Performed data management projects and fulfilled ad-hoc requests according to user specifications by utilizing data management software programs and tools like Perl, Toad, MS Access, Excel, and SQL.
  • Wrote SQL scripts to test the mappings and developed a Traceability Matrix of business requirements mapped to test scripts to ensure any change control in requirements leads to test case updates.
  • Involved in extensive data validation by writing several complex SQL queries and involved in back-end testing and worked with data quality issues.
  • Worked with end users to gain an understanding of information and core data concepts behind their business.
  • Assisted in defining business requirements for the IT team and created BRD and functional specifications documents along with mapping documents to assist the developers in their coding.
  • Worked on all phases of data warehouse development lifecycle, from gathering requirements to testing, implementation, and support using Pentaho Data Integration and SSIS.
  • Identified and recorded defects with the information required for issues to be reproduced by the development team.
  • Designed and developed database models for the operational data store, data warehouse, and federated databases to support the client's enterprise Information Management Strategy.
  • Worked flexible late hours to coordinate with the offshore team.
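
A minimal sketch of the kind of Oracle PL/SQL trigger referenced above; the claims table and audit table are hypothetical:

    -- Hypothetical audit trigger recording claim status changes
    CREATE OR REPLACE TRIGGER trg_claims_status_audit
    AFTER UPDATE OF claim_status ON claims
    FOR EACH ROW
    BEGIN
        INSERT INTO claims_status_audit
            (claim_id, old_status, new_status, changed_on)
        VALUES
            (:OLD.claim_id, :OLD.claim_status, :NEW.claim_status, SYSDATE);
    END;
    /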

Environment: MS SQL Server 2012, Oracle 10g, MS Office, DataStage 8.x, TIBCO Spotfire, ClearQuest, ClearCase, Teradata R13

Confidential, Rollinsford, NH

Data Warehousing Analyst/Data Modeler

Responsibilities:

  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment; developed working documents to support findings and assign specific tasks.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a complex EDW using Informatica.
  • Performed data reconciliation between integrated systems.
  • Managed, updated, and manipulated report orientation and structures using advanced Excel functions, including Pivot Tables and VLOOKUPs.
  • Experience using data cleansing techniques, Excel pivot tables, formulas, and charts.
  • Created relationship and column discovery in Informatica Developer and Informatica Analyst tools (IDQ) and exported the results into TDM.
  • Configured the match-and-merge process in the backend to run the data loads without using the Informatica MDM console.
  • Performed metrics reporting, data mining, and trend analysis in a helpdesk environment using Access.
  • Wrote complex SQL queries to validate the data against different kinds of reports generated by Business Objects XI R2.
  • Used Data Profile Viewer and wrote T-SQL database scripts to identify and quantify data defects and anomalies (a sample profiling query follows this list).
  • Extensively used MS Access to pull data from various databases and integrate the data.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Responsible for different data mapping activities from source systems to Oracle and SQL Server databases.
  • Assisted in the oversight of compliance with the Enterprise Data Standards.
  • Worked on importing and cleansing high-volume data from various sources like Oracle, flat files, and SQL Server 2005.
  • Wrote SQL scripts to test the mappings and developed a Traceability Matrix of business requirements mapped to test scripts to ensure any change control in requirements leads to test case updates.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked with data quality issues.
  • Developed regression test scripts for the application; involved in metrics gathering, analysis, and reporting to the concerned team; and validated the test programs.
  • Identified and recorded defects with the information required for issues to be reproduced by the development team.
  • Worked flexible late hours to coordinate with the offshore team.
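
A minimal T-SQL sketch of the kind of defect-quantification script described above; the schema, table, and column names are assumptions:

    -- Hypothetical profiling query counting common data defects in one pass
    SELECT COUNT(*)                                             AS total_rows,
           SUM(CASE WHEN member_id IS NULL THEN 1 ELSE 0 END)   AS null_member_ids,
           SUM(CASE WHEN LEN(zip_code) <> 5 THEN 1 ELSE 0 END)  AS bad_zip_codes,
           COUNT(*) - COUNT(DISTINCT claim_id)                  AS duplicate_claim_ids
    FROM   staging.claims;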

Environment: Quality Center 9.2, MS Excel 2007, PL/SQL, Informatica 8.6, Oracle 10g, Netezza, Aginity

Confidential, Birmingham, AL

Data Analyst

Responsibilities:

  • Involved in Data mapping specifications to generate mapping documents containing various transformation rules to be consumed by ETL teams.
  • Involved in compiling, organizing, mining and reporting financial data
  • Involved in development, implementation, and roll-out of dashboards for business objective metrics
  • Created and executed test scripts, scenarios and test plans that validated initial business requirements and desired analytical reporting capabilities
  • Produced data maps, collaborated in designing and validating the logical data model design and prototyping
  • Led the documentation effort for interviews, data warehousing requirements, and application data requirements.
  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment; developed working documents to support findings and assign specific tasks.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Performed data mining on claims data using very complex SQL queries and discovered claims patterns.
  • Responsible for different data mapping activities from source systems to the EDW, ODS, and data marts.
  • Created the test environment for the staging area and loaded the staging area with data from multiple sources.
  • Responsible for analyzing various heterogeneous data sources such as flat files, ASCII data, EBCDIC data, and relational data (Oracle, DB2 UDB, MS SQL Server).
  • Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text, etc.).
  • Performed ad hoc analyses as needed.
  • Tested several stored procedures and wrote complex SQL using CASE, HAVING, CONNECT BY, etc. (illustrative queries follow this list).
  • Tested the ETL process both before and after the data validation process; tested the messages published by the ETL tool and the data loaded into various databases.
  • Created UNIX scripts for file transfer and file manipulation.
  • Provided support to the client in assessing how many virtual user licenses would be needed for performance testing.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked with data quality issues.
  • Worked with end users to gain an understanding of information and core data concepts behind their business.
  • Assisted in defining business requirements for the IT team and created BRD and functional specifications documents along with mapping documents to assist the developers in their coding.
  • Tested the database to check field size validation, check constraints, stored procedures and cross verifying the field size defined within the application with metadata.
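
Illustrative queries of the kind tested above, using CASE, HAVING, and CONNECT BY; the claims and provider tables are hypothetical:

    -- Conditional aggregation with CASE and a HAVING filter
    SELECT provider_id,
           SUM(CASE WHEN claim_status = 'DENIED' THEN 1 ELSE 0 END) AS denied_cnt,
           COUNT(*)                                                 AS total_cnt
    FROM   claims
    GROUP  BY provider_id
    HAVING COUNT(*) > 100;

    -- Hierarchical walk with CONNECT BY, e.g. a provider organization roll-up
    SELECT provider_id, parent_provider_id, LEVEL
    FROM   provider_hierarchy
    START  WITH parent_provider_id IS NULL
    CONNECT BY PRIOR provider_id = parent_provider_id;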

Environment: Informatica 8.1, Data Flux, Oracle 9i, Quality Center 8.2, SQL, TOAD, PL/SQL, Flat Files, Teradata V2R6

Confidential, Chevy Chase, MD

Data Analyst

Responsibilities:

  • Designed and created test cases based on the business requirements (also referencing the source-to-target detailed mapping document and the transformation rules document).
  • Involved in extensive data validation using SQL queries and back-end testing (sample validation queries follow this list).
  • Used SQL for Querying the database in UNIX environment
  • Developed separate test cases for ETL process (Inbound & Outbound) and reporting
  • Involved with Design and Development team to implement the requirements.
  • Developed and manually executed test scripts to verify the expected results.
  • Designed and developed ETL processes using the Informatica ETL tool for dimension and fact file creation.
  • Involved in Manual and Automated testing using QTP and Quality Center.
  • Conducted black-box testing (functional, regression, and data-driven) and white-box testing (unit and integration, with positive and negative scenarios).
  • Tracked, reviewed, and analyzed defects and compared results using Quality Center.
  • Participated in MR/CR review meetings to resolve issues.
  • Defined the scope for System and Integration Testing.
  • Prepared and submitted summarized audit reports and took corrective actions.
  • Involved in uploading master and transactional data from flat files and in the preparation of test cases and subsystem testing.
  • Documented and published test results; troubleshot and escalated issues.
  • Prepared various test documents for the ETL process in Quality Center.
  • Involved in test scheduling and milestones with their dependencies.
  • Performed functionality testing of email notifications for ETL job failures, aborts, or data issue problems.
  • Identified, assessed, and communicated potential risks associated with the testing scope, product quality, and schedule.
  • Created and executed test cases for ETL jobs to upload master data to the repository.
  • Responsible for understanding, and training others on, enhancements or newly developed features.
  • Conducted load testing and provided input into capacity planning efforts.
  • Provided support to the client in assessing how many virtual user licenses would be needed for performance testing, specifically load testing using LoadRunner.
  • Created and executed test scripts, cases, and scenarios to determine optimal system performance according to specifications.
  • Modified the automated scripts from time to time to accommodate the changes/upgrades in the application interface.
  • Tested the database to check field size validation, check constraints, stored procedures and cross verifying the field size defined within the application with metadata.
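
Sample source-to-target validation queries of the kind used in the back-end testing above; the schema and table names are assumptions:

    -- Row-count reconciliation between source and target
    SELECT (SELECT COUNT(*) FROM src_orders)     AS source_rows,
           (SELECT COUNT(*) FROM dw_fact_orders) AS target_rows
    FROM   dual;

    -- Keys present in the source but missing from the target
    SELECT order_id FROM src_orders
    MINUS
    SELECT order_id FROM dw_fact_orders;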

Environment: Windows XP, Informatica Power Center 6.1/7.1, QTP 9.2, Test Director 7.x, Load Runner 7.0, Oracle 10g, UNIX AIX 5.2, PERL, Shell Scripting
