Sr. Data Specialist/Analyst/Modeler Resume

Baltimore, MD

SUMMARY

  • Overall 8 years of IT experience in Business/Data Analysis, Process Improvement, Business Intelligence, Data Integration, Data Profiling, Data Migration, and Metadata Management Services.
  • Strong understanding of Data Modeling concepts, Star and Snowflake Schemas, FACT and Dimension Tables, and Physical and Logical Data Models.
  • Ability to collaborate with peers, in both business and technical areas, to deliver optimal business process solutions in line with corporate priorities.
  • Software Development Life Cycle (SDLC) experience including Requirements, Specifications Analysis/Design, and Testing.
  • Experience with Stored Procedures, Stored Functions, Database Triggers, and Packages using PL/SQL; worked in domains including Banking, Finance, Insurance, and Health Care.
  • Strong experience gathering business requirements from business users, performing source-to-target mapping, and creating source-to-target mapping documents, process flows, and DFDs.
  • Proficient as a Data Analyst in gathering data from different sources, data profiling, data definition, and loading data into the business warehouse.
  • Experience conducting Joint Application Development (JAD) sessions for requirements gathering, analysis, and design, and Rapid Application Development (RAD) sessions to converge early on a design acceptable to the customer and feasible for the developers, limiting the project's exposure to the forces of change.
  • Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Informatica Power Center and Data Stage.
  • Worked with heterogeneous relational databases such as Teradata, Oracle, MS Access and SQL Server.
  • Experience with other utilities/tools/scripts such as Korn shell scripting, SQL*Plus, SQL*Loader, Export and Import utilities, TOAD, and Visio.
  • Technical expertise in ETL methodologies and Informatica 6.x/7.x/8.x/9.x: PowerCenter, PowerMart, client tools (Mapping Designer, Workflow Manager/Monitor), server tools (Informatica Server Manager, Repository Server Manager), and PowerExchange.
  • Solid understanding of Six Sigma, Agile, Waterfall, RUP, and Spiral methodologies, and of Data Warehousing concepts including Metadata and Data Marts.
  • Experience in conducting Joint Application Development (JAD) sessions with end users, Subject Matter Experts (SMEs), Development, and QA teams.
  • Extensive experience in project management best practices, processes, & methodologies, including Rational Unified Process (RUP) and SDLC.
  • Proficient in executing Unit Test Plans (UTP) and Integrated Test Plans (ITP); extensive experience in User Acceptance Testing (UAT) and in developing Test Scripts, Test Cases, and Test Plans.
  • Expertise in defining scope of projects based on business requirements, including documentation of constraints, assumptions, business impacts & project risks.
  • Involved in process design documentation for Data Warehouse dimensional upgrades and in the analysis and review of software and business requirement documents.
  • Skilled at interviewing Subject Matter Experts by asking detailed questions and carefully recording the requirements in a format that can be reviewed and understood by both business and technical people.
  • Designed and developed Informatica Mappings, Sessions, and Workflows based on business user requirements and business rules to load data from source flat files and RDBMS tables into target tables.
  • Worked on extracting, transforming, and loading (ETL) data from spreadsheets, flat files, database tables, and other sources using SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS).
  • Highly experienced in using Informatica Power Center and Informatica Data Quality (IDQ) workbench for coding the business rules related to enterprise data elements.
  • Extracted and integrated data from various sources (Oracle, SQL Server, and flat files) using Informatica into a single data warehouse repository.
  • Well versed in Data Migration, Data Conversions, and Data Extraction/Transformation/Loading (ETL) using DTS and PL/SQL scripts.
  • Expertise in source to target mapping in Enterprise and Corporate Data Warehouse environments.
  • Implemented Slowly Changing Dimensions Type 1, Type 2, and Type 3 methodology for accessing the full history of accounts and transaction information (see the SCD Type 2 sketch after this list).
  • Experienced as a Data Analyst performing complex Data Profiling, Data Definition, Data Mining, and data validation and analysis; presented reports and interacted with business users to collect and analyze business/functional requirements.
  • Excellent communication, documentation and presentation skills with clear understanding of business process flow.
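
The following is a minimal sketch of the SCD Type 2 pattern referenced above, in generic SQL. The customer_dim and stg_customer tables and their columns are hypothetical, illustrative names, not taken from any specific engagement.

```sql
-- Step 1: expire the current row for customers whose tracked attributes changed
UPDATE customer_dim d
   SET eff_end_dt   = CURRENT_DATE - 1,
       current_flag = 'N'
 WHERE current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.address <> d.address OR s.segment <> d.segment));

-- Step 2: insert a fresh current row for changed and brand-new customers;
-- unchanged customers still have a 'Y' row, so they are skipped
INSERT INTO customer_dim
       (customer_id, address, segment, eff_start_dt, eff_end_dt, current_flag)
SELECT s.customer_id, s.address, s.segment,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM customer_dim d
                    WHERE d.customer_id  = s.customer_id
                      AND d.current_flag = 'Y');
```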

TECHNICAL SKILLS

ETL/Modeling: Informatica, Informatica Data Quality (IDQ), Informatica Master Data Management (MDM), IBM InfoSphere RDM and MDM, SQL Server Integration Services (SSIS), Data Stage, Erwin, Power Designer, Hyperion HIS, MS Visio

BI tools: QlikView, Tableau, MSBI (SSIS, SSAS, SSRS), Data Cleaning, Data Blending, ETL, Data Wrangling, Data Mining, A/B Testing, Database Design

RDBMS: Oracle, Teradata, SQL Server, Informix, Sybase, MS Access

Query Tools: Oracle SQL Developer, Quest TOAD, SQL Navigator, SQL*Plus, Oracle Discoverer, SQL Server Management Studio, Teradata Queryman, and Embarcadero

OLAP: Essbase, Cognos, Brio Insight and Designer, Comshare Prism and One-Up

OS: Unix, SUSE Linux, Shell scripts, VI, Wang VS, IBM MVS, JCL, WinSCP

Job Scheduler: Maestro, Tivoli, AutoSys, UNIX scripts

Programming: PL/SQL, MS Basic, SQL

Office Tools: MS Project, MS PowerPoint, MS Excel, MS Word, MS Visio, MS Outlook, Lotus Notes

PROFESSIONAL EXPERIENCE

Sr. Data Specialist/Analyst/Modeler

Confidential - Baltimore, MD

Responsibilities:

  • Led the selection, deployment, and maintenance of data management and reporting systems, including operational data stores, data marts, enterprise data warehouses, and enterprise reporting/analytics tools. Built BI dashboards, data models, structures, and database views to serve as a foundation for reporting and systems-integration goals and strategies.
  • Advised on and led projects involving ETL-related activities and the migration or conversion of data between enterprise data systems; coordinated interactions between central IT, business units, and data stewards to achieve desired organizational outcomes.
  • Gathered and analyzed existing physical data models for in-scope applications and proposed changes to the data models according to the requirements.
  • Advised on and enforced data governance to improve the quality and integrity of data, and provided oversight of the collection and management of operational data.
  • Involved in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Ab Initio and Informatica PowerCenter, and in testing and writing SQL and PL/SQL statements: Stored Procedures, Functions, Triggers, and Packages (see the PL/SQL sketch after this list).
  • Gathered and analyzed business data requirements and modeled those needs, working closely with the users of the information, application developers, and architects to ensure the information models could meet their needs.
  • Extensively used Erwin for Data modeling and created Staging and Target Models for the Enterprise Data Warehouse.
  • Designed Star and Snowflake Data Models for the Enterprise Data Warehouse using Erwin (see the schema sketch after this list).
  • Worked with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
  • Developed SAS programs using SAS/BASE, SAS/SQL, SAS/STAT, and SAS/MACROS for descriptive and inferential statistical analysis and data displays.
  • Maintained data governance, master data management, and source-to-target mappings in Collibra.
  • Created BTEQ, FastExport, MultiLoad, TPump, and FastLoad scripts for extracting data from various production systems.
  • Transformed the Logical Data Model into the Physical Data Model, ensuring Primary Key and Foreign Key relationships in the PDM, consistency of Data Attribute definitions, and Primary Index considerations.
  • Implemented the Metadata Repository and maintained Data Quality: data cleanup procedures, transformations, data standards, the data governance program, scripts, stored procedures, triggers, and execution of test plans.
  • Performed daily tasks including backup and restore using SQL Server tools such as SQL Server Management Studio, SQL Server Profiler, SQL Server Agent, and Database Engine Tuning Advisor, and wrote T-SQL statements for applications.
  • Involved in Capacity Planning, Database Normalization and De-normalization process.
  • Designed physical and logical ER diagrams in Erwin, mapped the data into database objects, and identified the Facts and Dimensions from the business requirements to develop the logical and physical models.
  • Extensively involved in source system analysis, source to target mapping, data parsing and profiling.
  • Built a highly scalable Data Warehouse data model and complex ETL procedures to feed data to warehouse applications and to the Variance engine; designed Informatica transformations to extract the data and load it into the Oracle server database, where complex rollups were performed once the data reached the analytical server.
  • Troubleshot overlap and data quality issues by examining the existing policy administration office's data lineage, systems, and triggers for signs of bugs and malfunctions.
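
As a hedged illustration of the PL/SQL work noted above, a minimal Oracle stored procedure that upserts cleansed staging rows into a target table; stg_account and dim_account are assumed, illustrative names, not the actual objects.

```sql
CREATE OR REPLACE PROCEDURE load_dim_account AS
BEGIN
  -- Upsert cleansed staging rows into the target dimension (Oracle MERGE)
  MERGE INTO dim_account t
  USING (SELECT account_id, TRIM(account_name) AS account_name, status
           FROM stg_account
          WHERE account_id IS NOT NULL) s
     ON (t.account_id = s.account_id)
   WHEN MATCHED THEN
     UPDATE SET t.account_name = s.account_name,
                t.status       = s.status
   WHEN NOT MATCHED THEN
     INSERT (account_id, account_name, status)
     VALUES (s.account_id, s.account_name, s.status);

  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    RAISE;  -- surface the error to the calling job
END load_dim_account;
/
```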
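
And a compact sketch of the star-schema shape referenced above: one fact table keyed to surrogate-keyed dimensions. The sales subject area and all names are illustrative only.

```sql
CREATE TABLE dim_date (
  date_key     INTEGER      PRIMARY KEY,  -- surrogate key, e.g. 20240131
  calendar_dt  DATE         NOT NULL,
  fiscal_qtr   CHAR(2)      NOT NULL
);

CREATE TABLE dim_product (
  product_key  INTEGER      PRIMARY KEY,  -- surrogate key
  product_name VARCHAR(100) NOT NULL,
  category     VARCHAR(50)
);

-- Fact table: measures plus foreign keys to each dimension
CREATE TABLE fact_sales (
  date_key     INTEGER NOT NULL REFERENCES dim_date (date_key),
  product_key  INTEGER NOT NULL REFERENCES dim_product (product_key),
  units_sold   INTEGER NOT NULL,
  sales_amt    DECIMAL(12,2) NOT NULL
);
```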

Data/ Business Intelligence Analyst

Confidential - Columbus, OH

Responsibilities:

  • Worked on minimizing losses within the delinquent collections segments by providing specific collection treatments.
  • Used the data integration tool Pentaho to design ETL jobs in the process of building Data Warehouses and Data Marts.
  • Created pivot tables and ran VLOOKUPs in Excel as part of data validation, and designed and automated management reporting in Access/Excel with dashboards, pivot tables, standard and pivot charts, and linked data refreshes.
  • Implemented ETL processes using Ab Initio and Teradata utilities such as BTEQ, performed unit testing, and prepared Test Cases.
  • Used data visualization and reporting tools such as QlikView and MicroStrategy, and moved data sets across platforms (from PC and mainframe to UNIX and vice versa).
  • Formulated and executed analytics strategies for properties, analyzed and reported the latest industry trends for business implications, and produced reports to support analysis of loan statuses, including Performing, Delinquent, and Foreclosure.
  • Worked on the single-family project, extracting data from Teradata and developing macros to derive transformation variables from a single input variable.
  • Compared source data with historical data for statistical analysis; performed statistical analyses and QC'd statistical output.
  • Participated in user meetings, gathered Business requirements & specifications for the Data-warehouse design and translated the user inputs into ETL design docs.
  • Defined, and documented the technical architecture of the Data Warehouse, including the physical components and their functionality.
  • Created a Star schema dimensional model for the data mart using Visio, and created dimension and fact tables based on the business requirements.
  • Designed an ETL architecture to process a large number of files, and created high-level and low-level design documents.
  • Worked extensively with flat files and mainframe files, and created UNIX shell scripts for FTP, for generating list files to load multiple files, and for archiving files after loads completed.
  • Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
  • Worked on loading data into the Data Warehouse/Data Marts using Informatica and the Teradata MultiLoad, FastLoad, and BTEQ utilities.
  • Worked extensively on indexes in Teradata, creating proper Primary Indexes (PI) that account for both the planned access paths and even distribution of data across all available AMPs (see the Teradata sketch after this list).
  • Worked with DBA in making enhancements to physical DB schema. Also, coordinated with DBA in creating and managing table, indexes, table spaces, triggers, DB links and privileges.
  • Applied expertise in relational database concepts, stored procedures, functions, triggers, and scalability analysis; optimized SQL and PL/SQL queries using tuning techniques such as hints, parallel processing, and optimizer rules.
  • Responsible for SQL tuning and optimization using Analyze, Explain Plan and optimizer hints and maintained and tuned Teradata Production and Development systems.
  • Worked extensively on keeping table statistics up to date and verifying access paths with explain plans, and analyzed Skew Factor and query resource usage using query log data.
  • Supported application development timelines by implementing designs as well as incremental changes to database definition in a timely manner into production and non-production Teradata systems.
  • Performed tuning and optimization of database configuration and application SQL.
  • Loaded data using the Teradata loader connection, wrote Teradata utility scripts (FastLoad, MultiLoad), and worked with loader logs; scheduled workflows, BTEQ scripts, and UNIX shell scripts using the WLM scheduling tool.
  • Performed Detailed Data Analysis (DDA), Data Quality Analysis (DQA) and Data Profiling on source data.
  • Responsible for migrations of the code from Development environment to QA and QA to Production.
  • Documented the existing mappings as per the design standards followed in the project and prepared the validation scripts to compare the new data with legacy systems.
  • Carried out defect analysis and fixed bugs raised by users; worked effectively both in teams and independently.
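
A minimal Teradata-flavored sketch of the Primary Index and statistics work mentioned above; txn_detail and its columns are hypothetical names. The PRIMARY INDEX choice drives how rows hash across AMPs, and COLLECT STATISTICS feeds the optimizer whose plan EXPLAIN reports.

```sql
-- Choose a high-cardinality, frequently joined column as the PI so rows
-- hash evenly across AMPs and typical access stays single-AMP.
CREATE TABLE txn_detail (
  txn_id      BIGINT        NOT NULL,
  account_id  INTEGER       NOT NULL,
  txn_dt      DATE          NOT NULL,
  txn_amt     DECIMAL(12,2)
)
PRIMARY INDEX (txn_id);

-- Keep optimizer statistics current on join and filter columns.
COLLECT STATISTICS ON txn_detail COLUMN (account_id);
COLLECT STATISTICS ON txn_detail COLUMN (txn_dt);

-- Inspect the optimizer's plan (and confidence levels) before tuning.
EXPLAIN
SELECT account_id, SUM(txn_amt)
FROM   txn_detail
WHERE  txn_dt >= DATE '2015-01-01'
GROUP  BY account_id;
```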

Data Analyst

Confidential - Dublin, OH

Responsibilities:

  • Developed ETLs using Informatica PowerCenter and transformations like Filter, Lookup and Aggregator.
  • Performed SQL queries on data to develop marketing strategies leading to substantial increase in revenue.
  • Maintained constant touch with stakeholders and sustained communications amongst project departments.
  • Provided expertise in strategic planning, business process modeling, business process analysis, object-oriented analysis & design, use case modeling, use case analysis, component-based development, and quality assurance.
  • Gathered Business Requirements, analyzed data/workflows, and defined the scope.
  • Involved in dimensional modeling to design and develop Star schemas using Erwin, identifying Fact and Dimension tables.
  • Analyzed existing transactional database schemas and designed micro and macro level design models.
  • Created and maintained the Requirement Traceability Matrix between the requirements and other products such as design documents and test plans.
  • Tuned Teradata SQL queries and accomplished data migration, loading data from databases and files into Teradata by developing shell scripts and using Teradata SQL and utilities such as BTEQ, FastLoad, FastExport, and MultiLoad.
  • Developed Complex Mappings using Transformations (Filter, Router, Lookup, Update Strategy, and Expression) on the extracted data according to the Business Rules and Technical Specifications.
  • Created Transformations like Sequence generator, Lookup, joiner and Update Strategy transformations in Informatica Designer.
  • Used Informatica Repository Manager to maintain all the repositories of various applications.
  • Developed Informatica Mappings and Mapplets to load data using various Power Center transformations.
  • Created transformations, started concurrent batch processes on the server, and performed backup, recovery, and tuning of sessions.
  • Worked with SQL tools such as TOAD and SQL*Loader to run queries, load data, and validate the data (see the reconciliation sketch after this list).
  • Developed Unix/Linux shell scripts used in post-session commands in Informatica workflows, as well as for scheduling workflows, defining parameter files, and managing test cases across development and testing environments.
  • Involved in performing installations, development & maintenance of Informatica services in Unix/Linux environments.
  • Developed Test Scenarios and Test cases in HP Quality Center and involved in unit testing of mappings, system testing and user acceptance testing.
  • Documented client business needs, assisted in preparing project schedules and deliverables, and worked with operations to resolve data discrepancies related to user input or user error.
  • Managed scheduled production of client-facing and quarterly business reviews for various clients.
  • Delivered timely and accurate manufacturer reporting and data exchange with stakeholders and supervised scheduled production and quality assessment processes using Six Sigma tools.
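
One common way the data validation described above is performed, sketched as source-to-target reconciliation SQL in Oracle style; src_orders and tgt_orders are placeholder names for illustration.

```sql
-- Row-count reconciliation between source and target
SELECT (SELECT COUNT(*) FROM src_orders) AS src_rows,
       (SELECT COUNT(*) FROM tgt_orders) AS tgt_rows
  FROM dual;

-- Rows present in the source but missing from the target
SELECT order_id, order_dt, order_amt FROM src_orders
MINUS
SELECT order_id, order_dt, order_amt FROM tgt_orders;
```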

Data Analyst

Confidential - Long Beach, CA

Responsibilities:

  • Worked with data source systems and client systems to identify data issues and data gaps, and identified and recommended solutions.
  • Performed data analysis and data profiling using complex SQL on various source systems; wrote SQL queries against the databases, wrote test validation scripts, and performed system testing (see the profiling sketch after this list).
  • Wrote SQL scripts to test the mappings, and developed a traceability matrix mapping business requirements to test scripts so that any change control in requirements is reflected in the test cases.
  • Gathered the Sales Analysis report prototypes from business analysts belonging to different business units.
  • Responsible for ETL design (identifying source systems, designing source-to-target relationships, data cleansing, data quality, and creating source specifications and ETL design documents) and ETL development (following Velocity best practices).
  • Used external tables to transform and load data from legacy systems into target tables, alongside data transformation tools such as DTS, SSIS, Informatica, and DataStage (see the external-table sketch after this list).
  • Modeled Monthly Summary and Inventory data marts using Erwin (with dimensions such as Time, Services, Customers, and Sales Hierarchy, Orders snowflake dimensions, and various fact tables), and created, defined, and maintained data life-cycle documentation representing data elements across multiple upstream and downstream systems.
  • Proactively communicated and collaborated with external and internal customers to analyze information needs and acted as liaison between the business units, technology teams and support teams.
  • Responsible for researching data quality issues (inaccuracies in data), worked with business owners/stakeholders to assess business and risk impact, provided solution to business owners.
  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from source to target data environment.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Created and executed automated tests for functional and regression testing in Quick Test Professional using VB scripts.
  • Executed SQL queries for data validation and used TOAD for SQL Server to write SQL queries validating constraints and indexes.
  • Used ETL methodology for supporting data extraction, transformations and loading processing, in a complex EDW using Informatica.
  • Performed slicing and dicing of data using pivot tables to identify the churn-rate pattern, and prepared reports as required.
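
A small sketch of the kind of profiling SQL referenced above, computing completeness and cardinality for one column; member_tbl and plan_code are illustrative names only.

```sql
-- Column-level profile: volume, completeness, and cardinality
SELECT COUNT(*)                  AS total_rows,
       COUNT(plan_code)          AS non_null_rows,
       COUNT(DISTINCT plan_code) AS distinct_values,
       ROUND(100 * (COUNT(*) - COUNT(plan_code))
             / NULLIF(COUNT(*), 0), 2) AS pct_null
  FROM member_tbl;
```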
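
And a hedged sketch of the external-table loading pattern mentioned above, in Oracle syntax; the directory object, file name, and column list are assumptions for illustration.

```sql
-- Expose a legacy flat file as a queryable table (Oracle external table)
CREATE TABLE legacy_claims_ext (
  claim_id    NUMBER,
  member_id   NUMBER,
  claim_amt   NUMBER(12,2)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY etl_stage_dir          -- assumed DIRECTORY object
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('legacy_claims.csv')
);

-- Transform and load into the target in one set-based statement
INSERT INTO claims_target (claim_id, member_id, claim_amt)
SELECT claim_id, member_id, NVL(claim_amt, 0)
FROM   legacy_claims_ext;
```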
