
Sr. Data Analyst/Data Modeler Resume


New York, NY

SUMMARY

  • 9+ years of industry experience as a Data Analyst, with a solid understanding of data modeling, evaluating data sources, Data Warehouse/Data Mart mapping, data analysis, data profiling, and data quality concepts.
  • Expert in writing and optimizing SQL queries in Oracle, Netezza, and Teradata V2R6/R12/R13.
  • Extensive experience with SQL and SQL scripting for relational databases such as Oracle, Microsoft SQL Server, IBM DB2, and MySQL.
  • Good experience with ETL tools such as Informatica and DataStage.
  • Experience with Business Objects, TIBCO Spotfire, and MicroStrategy.
  • Excellent experience with the design of large-scale ETL solutions integrating multiple source systems.
  • Strong understanding of KPIs, data relationships, reporting schemas, and data transformations.
  • Solid Experience in data analysis and problem solving with large amounts of data
  • Solid Experience in data architecture, data warehousing, master data management, enterprise information integration and ETL.
  • Excellent experience with data analysis, modeling and design specific to a data warehouse
  • Excellent experience using different development methodologies, including SDLC, Scrum, Agile, and test-driven development.
  • Ability to mentor others and provide technical direction on data architecture, reporting, warehousing and OLAP design.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle, Netezza, and Teradata V2R6/R12/R13.
  • Experience in data masking, loading masked data into test and development environments.
  • Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch, Data quality concepts
  • Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export
  • Solid experience in testing and writing SQL and PL/SQL statements - Stored Procedures, Functions, Triggers and packages.
  • Excellent knowledge in preparing required project documentation and tracking and reporting regularly on the status of projects to all project stakeholders
  • Extensive knowledge and experience in producing tables, reports, graphs and listings using various procedures and handling large databases to perform complex data manipulations.
  • Excellent experience in Data mining with querying and mining large datasets to discover transition patterns and examine financial data.
  • Experience in testing Business Intelligence reports generated by various BI Tools like Spotfire and Business Objects
  • Excellent knowledge of creating DML statements for underlying databases.
  • Extensive ETL testing experience using Informatica 9x/8x (Power Center/ Power Mart) (Designer, Workflow Manager, Workflow Monitor and Server Manager)
  • Good exposure to working in an offshore/onsite model, with the ability to understand and create functional requirements working with clients; good experience in requirements analysis and generating test artifacts from requirements documents.
  • Excellent in creating various artifacts for projects which include specification documents, data mapping and data analysis documents.
  • An excellent team player, technically strong, with the ability to work with business users, project managers, team leads, architects, and peers, maintaining a healthy project environment.

TECHNICAL SKILLS

Data Warehousing: Informatica 9.5/9.1/8.6/7.1.2 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Ab-Initio, Talend

Reporting Tools: Business Objects 6.5/XI R2, Tableau, TIBCO Spotfire

Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables, Erwin

RDBMS: Netezza TwinFin, Teradata R14/R13/R12, Oracle 11g/10g/9i/8i/7.x

Programming: SQL, PL/SQL, UNIX Shell Scripting, VB Script

Environment: Windows (95, 98, 2000, NT, XP), UNIX

Other Tools: TOAD, MS-Office suite (Word, Excel, Project and Outlook), BTEQ, Teradata V2R6/R12/R13/R14 SQL Assistant, Aginity

PROFESSIONAL EXPERIENCE

Confidential, New York, NY

Sr. Data Analyst/Data Modeler

Responsibilities:

  • Worked with Data Stewards and subject matter experts to research reported data anomalies, identified root causes, and determined appropriate solutions.
  • Worked with data source systems and client systems to identify data issues and data gaps, and recommended solutions.
  • Created SQL queries and performed query performance tuning.
  • Worked closely with Architects and developers to deliver the database components for large scale, complex web based applications.
  • Developed and maintained stored procedures. User access maintenance. Implemented changes to database design including tables and views.
  • Assisted developers and customers with ad hoc retrieval of information
  • Worked on data manipulation and analysis; accessed raw data in varied formats using different methods, and analyzed and processed the data.
  • Performed data modeling and data analysis as required.
  • Acted as liaison between the business units, technology teams and support teams.
  • Responsible for researching data quality issues (inaccuracies in data), worked with business owners/stakeholders to assess business and risk impact, provided solution to business owners
  • Analysis of functional and non-functional categorized data elements for data profiling and mapping from source to target data environment.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Extracted data from various sources such as Oracle, mainframes, and flat files, and loaded it into the target Netezza database.
  • Worked on data warehousing, ETL, SQL, scripting and big data (MPP + Hadoop).
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a complex EDW using Informatica.
  • Perform data reconciliation between integrated systems.
  • Produced metrics reporting, data mining, and trend analysis in a helpdesk environment using MS Access.
  • Involved in data mining, transformation, and loading from the source systems to the target system.
  • Supported business areas and database platforms to ensure logical data model and database design, creation, and generation follows enterprise standards, processes, and procedures
  • Designed database solution for applications, including all required database design components and artifacts
  • Provided input into database systems optimization for performance efficiency and worked on full lifecycle of data modeling (logical - physical - deployment)
  • Involved with data cleansing/scrubbing and validation.
  • Used Netezza analytic functions such as RANK and ROW_NUMBER.
  • Performed slicing and dicing on data using Pivot tables to identify churn-rate patterns, and prepared reports as required.
  • Worked on TIBCO Spotfire to create analytics and predictive-analytics reports and mobile metrics.
  • Prepared in-depth data analysis reports weekly, biweekly, and monthly using MS Excel, SQL, and UNIX.
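The analytic-function pattern mentioned above (RANK and ROW_NUMBER over partitions) can be sketched with a minimal example. SQLite is used here only as a stand-in for Netezza, and the table and column names are illustrative, not from the original project:

```python
import sqlite3

# In-memory stand-in for the warehouse; table/columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE churn (region TEXT, customer TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO churn VALUES (?, ?, ?)",
    [("East", "A", 500.0), ("East", "B", 700.0),
     ("West", "C", 400.0), ("West", "D", 300.0)],
)

# RANK and ROW_NUMBER partitioned by region, highest revenue first --
# the same analytic-function pattern available in Netezza.
rows = conn.execute("""
    SELECT region, customer, revenue,
           RANK()       OVER (PARTITION BY region ORDER BY revenue DESC) AS rnk,
           ROW_NUMBER() OVER (PARTITION BY region ORDER BY revenue DESC) AS rn
    FROM churn
    ORDER BY region, rnk
""").fetchall()
for r in rows:
    print(r)
```

The same SELECT runs essentially unchanged on Netezza; only the connection setup differs.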

Environment: PL/SQL, Informatica 9.5, Oracle 11G, Netezza, Aginity, ERWIN data modeler, UNIX, TIBCO Spotfire

Confidential, Pittsburgh, PA

Sr. Data Analyst / Data Modeler/Data Architect

Responsibilities:

  • Used and supported database applications and tools for extraction, transformation and analysis of raw data
  • Understood business processes, data entities, data producers, and data dependencies
  • Developed, managed and validated existing data models including logical and physical models of the data warehouse and source systems utilizing a 3NF model
  • Developed and programmed test scripts to identify and manage data inconsistencies and testing of ETL processes
  • Analysis of functional and non-functional categorized data elements for data profiling and mapping from source to target data environment.
  • Developed working documents to support findings and assign specific tasks
  • Created data masking mappings to mask the sensitive data between production and test environment.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Netezza.
  • Wrote several UNIX Korn shell scripts for file transfers, error logging, data archiving, log file checks, and cleanup processes.
  • Wrote SQL scripts to test the mappings, and developed a traceability matrix of business requirements mapped to test scripts to ensure any change in requirements leads to a test case update.
  • Involved in extensive data validation by writing complex SQL queries, in back-end testing, and in working through data quality issues.
  • Worked with end users to gain an understanding of information and core data concepts behind their business.
  • Assisted in defining business requirements for the IT team and created BRD and functional specifications documents along with mapping documents to assist the developers in their coding.
  • Flexible to work late hours to coordinate with offshore team.
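A data-masking step of the kind described above can be sketched as follows; the hashing scheme, field names, and suffix-preserving rule are illustrative assumptions, not the actual masking mappings used on the project:

```python
import hashlib

def mask_value(value: str, keep_last: int = 0) -> str:
    """Deterministically mask a sensitive value, optionally keeping a
    trailing suffix (e.g. last 4 digits) so test data stays realistic.
    Illustrative sketch only."""
    digest = hashlib.sha256(value.encode()).hexdigest()[:8]
    suffix = value[-keep_last:] if keep_last else ""
    return f"MASKED-{digest}" + (f"-{suffix}" if suffix else "")

# Production-like row headed for the test environment (hypothetical fields).
row = {"customer_id": "C1001", "ssn": "123-45-6789", "balance": 2500.75}
masked = dict(row, ssn=mask_value(row["ssn"], keep_last=4))
print(masked["ssn"])
```

Deterministic masking keeps referential integrity across tables: the same production value always masks to the same test value.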

Environment: Oracle 10g, MS Office, Business Objects, ClearQuest, ClearCase, Netezza

Confidential, Warrenville, IL

Sr. Data Analyst

Responsibilities:

  • Analysis of functional and non-functional categorized data elements for data profiling and mapping from source to target data environment.
  • Developed working documents to support findings and assign specific tasks
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Performed data mining on Claims data using very complex SQL queries and discovered claims pattern.
  • Created DML code and statements for underlying & impacting databases.
  • Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a complex EDW using Informatica.
  • Perform data reconciliation between integrated systems.
  • Metrics reporting, data mining and trends in helpdesk environment using Access
  • Wrote complex SQL queries to validate data against various reports generated by Business Objects XI R2.
  • Extensively used MS Access to pull data from various databases and integrate the data.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Responsible for different Data mapping activities from Source systems to Teradata
  • Assisted in the oversight for compliance to the Enterprise Data Standards
  • Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle, flat files, and SQL Server 2005.
  • Worked with Excel Pivot tables.
  • Developed regression test scripts for the application and Involved in metrics gathering, analysis and reporting to concerned team and Tested the testing programs
  • Identify & record defects with required information for issue to be reproduced by development team.
  • Flexible to work late hours to coordinate with offshore team.
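The validation queries described above amount to source-to-target reconciliation: comparing counts and totals and isolating rows that never arrived. A minimal sketch, with SQLite standing in for Oracle/Teradata and all table names hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_claims (claim_id INTEGER, amount REAL);
    CREATE TABLE tgt_claims (claim_id INTEGER, amount REAL);
    INSERT INTO src_claims VALUES (1, 100.0), (2, 250.0), (3, 75.0);
    INSERT INTO tgt_claims VALUES (1, 100.0), (2, 250.0);  -- claim 3 missing
""")

# Reconciliation: row counts and amount totals must match between systems.
src_cnt, src_sum = conn.execute(
    "SELECT COUNT(*), SUM(amount) FROM src_claims").fetchone()
tgt_cnt, tgt_sum = conn.execute(
    "SELECT COUNT(*), SUM(amount) FROM tgt_claims").fetchone()

# List any source rows that never made it to the target.
missing = conn.execute("""
    SELECT claim_id FROM src_claims
    EXCEPT
    SELECT claim_id FROM tgt_claims
""").fetchall()
print(src_cnt, tgt_cnt, missing)
```

In practice the EXCEPT (or MINUS, on Oracle) query pinpoints the exact keys behind any count or sum discrepancy.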

Environment: Quality Center 9.2, MS Excel 2007, PL/SQL, Business Objects XI R2, Informatica 8.6 (ETL), Oracle 10g, Teradata

Confidential, Springfield, IL

Data Analyst/Data Quality Expert

Responsibilities:

  • Designed and developed exception handling and data cleansing / standardization procedures
  • Evaluated and improved data quality throughout source systems by implementing data quality safeguards
  • Led discussions with business users and source system owners to derive business and data rules used in data quality assessments.
  • Determined the quality and integrity of data required to meet business needs of the end users and worked towards resolving the most critical data quality issues.
  • Investigated, analyzed, documented, and assisted in the resolution of data quality issues reported by business users and IS personnel.
  • Provided data flow diagrams and documented processes.
  • Involved in Data mapping specifications to create and execute detailed system test plans.
  • The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center.
  • Setting up of environments to be used for testing and the range of functionalities to be tested as per technical specifications.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Responsible for different Data mapping activities from Source systems to EDW, ODS & data marts.
  • Created the test environment for Staging area, loading the Staging area with data from multiple sources.
  • Responsible for analyzing various data sources such as flat files, ASCII Data, EBCDIC Data, and Relational Data from various heterogeneous data sources.
  • Delivered files in various formats (e.g., Excel, tab-delimited text, comma-separated text, pipe-delimited text).
  • Performed ad hoc analyses as needed.
  • Responsible for creating test cases to ensure that data originating from the source makes it into the target in the right format.
  • Tested several stored procedures and wrote complex SQL using CASE, HAVING, CONNECT BY, etc.
  • Involved in Oracle SQL development, unit testing, and performance tuning, and ensured testing issues were resolved using defect reports.
  • Tested the ETL process for both before data validation and after data validation process. Tested the messages published by ETL tool and data loaded into various databases
  • Created UNIX scripts for file transfer and file manipulation.
  • Provided support to the client in assessing how many virtual user licenses would be needed for performance testing.
  • Tested the database to check field size validation, check constraints, stored procedures and cross verifying the field size defined within the application with metadata.
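The cleansing/standardization procedures mentioned at the top of this role can be sketched as a simple record-level pass; the field names and standardization rules here are illustrative assumptions, not the project's actual rules:

```python
def standardize(record: dict) -> dict:
    """Trim whitespace, normalize code fields to upper case, and map empty
    strings to None -- a minimal sketch of a cleansing/standardization step.
    Field names ('state', 'gender') are hypothetical."""
    cleaned = {}
    for key, value in record.items():
        if isinstance(value, str):
            value = value.strip()
            if value == "":
                value = None           # treat empty strings as missing
            elif key in ("state", "gender"):
                value = value.upper()  # standardize short code fields
        cleaned[key] = value
    return cleaned

cleaned = standardize(
    {"name": "  Jane Doe ", "state": "il", "gender": " f ", "phone": ""})
print(cleaned)
```

Exception handling in a real pipeline would route records that fail these rules to a reject table rather than silently fixing them.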

Environment: Informatica 8.1, Data Flux, Oracle 9i, Quality Center 8.2, SQL, TOAD, PL/SQL, Flat Files

Confidential

Data Analyst / Modeler

Responsibilities:

  • Identified and quantified data variances, maintained process documentation, formulated and proposed quality improvement strategy to ensure data integrity.
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Used SQL for Querying the database in UNIX environment
  • Involved with Design and Development team to implement the requirements.
  • Worked as primary point-of-contact for problem escalation and resolution for data management functions
  • Recognized data patterns and worked with the IT team on bug fixes and cleanup to ensure the best customer experience.
  • Conducted verification of data before and after development code releases on a four-week release cycle.
  • Prepared specifications for development code changes after research for data cleanup scenarios.
  • Performed data mining and manipulation and assisted with compiling historical service information for ad hoc projects and presentations.
  • Identified emerging trends to analyze the impact on information retrieval and reporting, to determine standards for report generation, maintenance, and distribution.
  • Produced Mapping Documents, Validation Queries.
  • Coordinated with internal data analysis and data modeling teams and with external ETL, QC, Architecture, and Production Support teams.
  • Worked on data quality, data profiling, data warehousing, data modeling, and ETL scripts.
  • Developed, prepared, and generated ongoing and ad-hoc special reports pertaining to contract compliance issues.
  • Responded to requests for information from the database in a timely and professional manner.
  • Tested the database to check field size validation, check constraints, stored procedures and cross verifying the field size defined within the application with metadata.
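The field-size check against metadata described above can be sketched as follows; the metadata dictionary and column names are hypothetical stand-ins for the application's actual data dictionary:

```python
# Hypothetical metadata: column name -> maximum allowed length.
METADATA = {"cust_name": 30, "state_cd": 2, "zip": 5}

def field_size_violations(row: dict, metadata: dict) -> list:
    """Return (column, actual_len, max_len) for every value that exceeds
    the length defined in the metadata -- a minimal sketch of cross-verifying
    application field sizes against metadata."""
    return [
        (col, len(str(val)), metadata[col])
        for col, val in row.items()
        if col in metadata and len(str(val)) > metadata[col]
    ]

row = {"cust_name": "Jane Doe", "state_cd": "ILL", "zip": "627011"}
violations = field_size_violations(row, METADATA)
print(violations)
```

The same comparison can be driven from the database catalog (e.g. ALL_TAB_COLUMNS on Oracle) instead of a hand-maintained dictionary.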

Environment: Windows XP, Informatica Power Center 6.1/7.1, QTP 9.2, Test Director 7.x, Load Runner 7.0, Oracle 10g, UNIX AIX 5.2, PERL, Shell Scripting
