
Sr. SQL/Informatica Engineer Resume


Columbus, OH

SUMMARY

  • Overall 8+ years of experience across the SDLC (Software Development Life Cycle): analyzing business requirements, design and development, data modeling, data management strategy, and implementation and testing of data warehouses.
  • Experience with Erwin and strong knowledge of relational and dimensional data modeling concepts, including the design and implementation of star and snowflake schemas for OLAP systems, with a good understanding of OLTP and OLAP architectures.
  • Work experience has provided in-depth knowledge of the Retail, Healthcare, and Tax domains, including procedures, implementation techniques, and system development, as well as strong customer relationship skills.
  • Expertise in data transformation and loading for the design and development of ETL methodology in a corporate-wide ETL solution using Informatica Power Center 9.1/8.6/8.1/7.1, Informatica Power Exchange 9.1/8.1, Oracle 11g/10g, SQL Server, SQL, and PL/SQL.
  • Extensive experience in developing transformations, mappings, mapplets, worklets, sessions, and tasks, and in scheduling workflows and sessions.
  • Good knowledge of OLTP and OLAP systems, developing database schemas (star and snowflake, with dimensions and facts) and E-R models, with expertise in integrating various sources: Oracle, Teradata, and flat files.
  • Proficient in data warehouse implementation from requirements gathering through data analysis, data modeling, application design, development, and data cleansing.
  • Good experience with Teradata queries and utilities such as BTEQ, MultiLoad, FastLoad, TPump, and FastExport.
  • Experience in creating efficient high-level and low-level ETL design documents.
  • Worked on Informatica and SQL performance tuning.
  • Worked on effort estimation and coordinated with various teams on different issues; worked with clients, users, and onsite and offshore teams while keeping track of work progress.
  • Capable of managing time against deadlines, identifying potential project risks, and prioritizing tasks to meet project timelines.
  • Extensively used SCDs (Slowly Changing Dimensions) in an insurance application.
  • Good experience in Debugging and performance tuning of sources, targets, mappings, transformations and sessions.
  • Responsible for Data Extraction, data transformation and data loading from multiple sources into the data warehouse.
  • Experience in task automation using UNIX shell scripts, and in job scheduling and communicating with the server using pmcmd and pmrep to schedule and control sessions, batches, and repository tasks (see the sketch after this list).
  • Excellent analytical, problem solving and communication skills with ability to interact with individuals at all levels.
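
For illustration, a minimal sketch of the pmcmd-driven shell automation mentioned above; the domain, service, folder, workflow, and credential names are hypothetical placeholders rather than values from the engagements below.

#!/bin/ksh
# Minimal sketch: start an Informatica workflow via pmcmd and wait for it.
# The domain, service, folder, workflow, and credentials are hypothetical.
DOMAIN=Dom_Dev
SERVICE=IS_Dev
FOLDER=DW_LOADS
WORKFLOW=wf_daily_load

# -pv reads the password from the named environment variable;
# -wait blocks until the workflow finishes and sets the exit code.
pmcmd startworkflow -d "$DOMAIN" -sv "$SERVICE" \
    -u etl_user -pv ETL_PWD \
    -f "$FOLDER" -wait "$WORKFLOW"
rc=$?
if [ $rc -ne 0 ]; then
    echo "Workflow $WORKFLOW failed with return code $rc" >&2
    exit $rc
fi
echo "Workflow $WORKFLOW completed successfully"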

TECHNICAL SKILLS

Data Modeling Tool: Erwin 7.1.3/6.0, Oracle Designer

Data Warehouse/ETL Tool: Informatica Power Center 9.1/8.6/7.1 (PC and IDQ), DataStage 8.7, Pentaho

Databases: Oracle 10g/9i/8i/7.x, Teradata 15.0, SQL Server, MySQL

Languages: PL/SQL, C, C++, SQL, UNIX shell scripting.

Web Tools: HTML, XML, JavaScript

Tools: TOAD, PL/SQL Developer, Data Ladder, Tableau

Operating Systems: Microsoft Windows 95/98/2000/XP/NT, UNIX (AIX)

PROFESSIONAL EXPERIENCE

Confidential - Columbus, OH

Sr. SQL/Informatica Engineer

Responsibilities:

  • Involved in gathering requirements from business users on a day-to-day basis.
  • Worked on analyzing systems data stored in an enterprise data warehouse (EDW).
  • Developed ad-hoc queries on the Teradata system for summarizing business operations and trends.
  • Developed Teradata utility scripts to load external file data into custom table space for data mining.
  • Created Cypher queries for real-time link analysis to establish patterns between data sets (see the first sketch after this list).
  • Identified data correlations across vendor data and business data using data matching enterprise tool.
  • Customized data models and tuned SQL queries for better database performance.
  • Worked with the IBM ClearCase version control software to update statistical models, queries, and documentation.
  • Performed multidimensional analysis of business data, interpreted results, and generated trend and predictive analyses.
  • Studied data behavior for real-time analytics on an IBM DB2 transactional system to support statistical analysis.
  • Worked on code optimization and end-to-end testing with all possible test case scenarios.
  • Took part in stand-up meetings, sprint retrospectives, and sprint planning as part of an agile software development methodology.
  • Provided documentation for business requirements, data mapping, and production release.
  • Rewrote existing scripts as part of a discovery program for business taxes such as sales and use tax.
  • Developed Tableau dashboards for business users to identify fraud patterns during tax filing season before releasing refunds; pulled production data from IBM DB2 into Teradata using Teradata Parallel Transporter and wrote BTEQ scripts to push data into Teradata tables (see the second sketch after this list).
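
A minimal sketch of a link-analysis Cypher query of the kind described above, run through Neo4j's cypher-shell CLI; the node labels, properties, and credentials are hypothetical placeholders.

#!/bin/ksh
# Minimal sketch: run a link-analysis Cypher query from the shell.
# Node labels, properties, and credentials are hypothetical placeholders.
cypher-shell -u neo4j -p "$NEO4J_PWD" <<'CYPHER'
// Flag pairs of taxpayer accounts that share the same bank account,
// a simple graph pattern for spotting potentially linked filings.
MATCH (a:Account)-[:PAID_TO]->(b:BankAccount)<-[:PAID_TO]-(c:Account)
WHERE a <> c
RETURN a.account_id, c.account_id, b.routing_number
LIMIT 25;
CYPHER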
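
And a minimal sketch of a BTEQ push into a Teradata table; the TDPID, credentials, and schema/table names are hypothetical placeholders.

#!/bin/ksh
# Minimal sketch: BTEQ script that pushes staged rows into a Teradata table.
# TDPID, credentials, and schema/table names are hypothetical placeholders.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pwd;

/* Push newly staged filings into the reporting table */
INSERT INTO edw_tax.filings_rpt (filing_id, tax_type, amount, load_dt)
SELECT filing_id, tax_type, amount, CURRENT_DATE
FROM   edw_stg.filings_stg
WHERE  load_dt = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF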

Environment: Teradata 15.0, IBM DB2, Tableau, Data Ladder, Neo4j Cypher, MySQL, Pentaho, IBM SPSS, IBM ClearCase, Erwin Data Modeler.

Confidential - Chicago, IL

Sr. ETL Developer

Responsibilities:

  • Involved in planning, documenting, and developing ETL for the company's data warehouse.
  • Reviewed the Business Requirements Documents (BRDs), high-level design documents, and high-level test cases, which are the inputs for producing the low-level design phase deliverables.
  • Developed a number of complex mappings using Lookup (connected and unconnected), Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Normalizer, Sequence Generator, Update Strategy, Transaction Control, and SQL transformations, along with mapplets, to implement the business logic and load the data incrementally.
  • Loaded data from an EDI source to the claims data mart, including design, data analysis, development, debugging, and testing of all components.
  • Developed and deployed ETL job workflows with reliable error/exception handling and a rollback framework.
  • Used Teradata Manager tools for monitoring and controlling the system; worked with Teradata 12 utilities such as BTEQ, FastLoad, MultiLoad, and Queryman.
  • Updated numerous BTEQ/SQL scripts, making appropriate DDL changes, and completed unit and system testing.
  • Architected and developed FastExport and MultiLoad control-file scripts, developed macros and stored procedures to extract data, and wrote BTEQ scripts that take a date range from the database to drive extraction (see the first sketch after this list).
  • Worked with Teradata utilities such as BTEQ, FastLoad, MultiLoad, TPump, and Teradata Parallel Transporter.
  • Used advanced Informatica features such as pushdown optimization (PDO) and TPT; used full PDO on Teradata and worked with different Teradata load operators.
  • Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements against the Teradata RDBMS.
  • Loaded the load-ready file into a Teradata table using Teradata ODBC, FastLoad, and MultiLoad connections; the load date and load time are also captured in the Teradata table.
  • Developed SQL scripts in TOAD and created Oracle objects such as tables, materialized views, views, indexes, sequences, and synonyms.
  • Handled Slowly Changing Dimensions of Type 1 and Type 2 to populate current and historical data into dimension and fact tables in the data warehouse.
  • Involved in replicating transaction data to heterogeneous and big data systems in real time.
  • Debugged mappings by creating logic that assigns a severity level to each error and sends the error rows to an error table so they can be corrected and reloaded into the target system.
  • Created/modified various Informatica mappings and workflows for the successful migration of data from various source systems to Oracle, meeting the business requirements for reporting and analysis.
  • Created dynamic PL/SQL procedures to load data from the staging area to the data marts (see the second sketch after this list).
  • Developed logical and physical data models using Erwin, following a star schema to build the reporting data mart.
  • Used pmcmd to call batches and sessions from command mode and embedded the same calls in UNIX shell scripts.
  • Scheduled sessions and batches on the Informatica server using Informatica Server Manager.
  • Migrated the ETL code across the environments (Dev, SIT, UAT and Prod).
  • Documented Mappings, Transformations and Informatica sessions for further maintenance and support.
  • Developed the unit test case documents and performed various kinds of testing, such as unit, regression, and system testing, in the Dev and QA environments before deployment.
  • Played key role in performance improvement activities like query tuning.
  • Performed data profiling and analysis using Informatica Data Quality (IDQ).
  • Implemented data quality rules using IDQ to check the correctness of the source files and perform data cleansing/enrichment.
  • Coordinated with the team and interacted with the client on the implementation of deliverables.
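
As a hedged illustration of the MultiLoad control-file scripts above, a minimal sketch; the TDPID, credentials, tables, input file, and layout are hypothetical placeholders.

#!/bin/ksh
# Minimal sketch: apply a pipe-delimited extract to a Teradata table with
# MultiLoad. TDPID, credentials, tables, and the layout are hypothetical.
mload <<'EOF'
.LOGTABLE edw_wrk.claims_ml_log;
.LOGON tdprod/etl_user,etl_pwd;

.BEGIN IMPORT MLOAD
    TABLES edw.claims
    ERRORTABLES edw_wrk.claims_et edw_wrk.claims_uv;

.LAYOUT claim_layout;
    .FIELD claim_id  * VARCHAR(18);
    .FIELD member_id * VARCHAR(18);
    .FIELD paid_amt  * VARCHAR(18);

.DML LABEL ins_claim;
    INSERT INTO edw.claims (claim_id, member_id, paid_amt)
    VALUES (:claim_id, :member_id, :paid_amt);

.IMPORT INFILE /data/in/claims.dat
    FORMAT VARTEXT '|'
    LAYOUT claim_layout
    APPLY ins_claim;

.END MLOAD;
.LOGOFF;
EOF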
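
And a minimal sketch of a staging-to-mart PL/SQL load procedure deployed through SQL*Plus; the connect string, schemas, and table names are hypothetical placeholders.

#!/bin/ksh
# Minimal sketch: deploy a staging-to-mart load procedure via SQL*Plus.
# Connect string, schemas, and table names are hypothetical placeholders.
sqlplus -s "etl_user/$ORA_PWD@ORADEV" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE

CREATE OR REPLACE PROCEDURE load_sales_mart (p_load_dt IN DATE) AS
BEGIN
    -- Straight staging-to-mart insert; a production version would add
    -- merge/SCD handling and row-level error logging.
    INSERT INTO mart.sales_fact (sale_id, store_id, amount, load_dt)
    SELECT sale_id, store_id, amount, p_load_dt
    FROM   stg.sales_stg
    WHERE  load_dt = p_load_dt;
    COMMIT;
END load_sales_mart;
/
EXIT
EOF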

Environment: Informatica Power Center 9.1 (IDQ), Teradata 13, PL/SQL, UNIX.

Confidential - Merrimack, NH

Sr. ETL Developer

Responsibilities:

  • Worked with business analysts to identify appropriate sources for data warehouse and to document business needs for decision support for data.
  • Involved in processing policy and claims data in the dev system.
  • Prepared BTEQ scripts to load data from Preserve area to Staging area.
  • Worked with SET and MULTISET tables for performance evaluation of the scripts.
  • Used the Teradata FastLoad, MultiLoad, and FastExport utilities, and created batch jobs using BTEQ.
  • Worked on Teradata SQL Assistant querying the source/target tables to validate the BTEQ scripts.
  • Extensively used the XML Source Qualifier, midstream XML Parser, and midstream XML Generator transformations, as our main source type was XML files.
  • Imported Metadata from Teradata tables.
  • Loaded the load-ready file into a Teradata table using a Teradata ODBC/FastLoad connection; the load date and load time are also captured in the Teradata table.
  • Extensively used Dynamic Lookup transformation and Update Strategy transformation to update slowly changing dimensions.
  • Worked with sources such as DTS packages (connections, tasks, and workflows) that can be used to access, transform, and manipulate a wide range of sources, including text files and relational databases.
  • Worked on Version Control in Informatica to maintain multiple versions of an object, control development on the object and track changes.
  • A load-ready file (flat file) is created from the source file (a delimited file) after a series of transformations and error handling; this serves as a staging area for cleansing the data. Errors are captured in a delimited error file.
  • Used session parameters and mapping variables/parameters, and created parameter files to allow flexible workflow runs based on changing variable values.
  • Created BTEQ (Basic Teradata Query) scripts to generate keys.
  • Devised extensive PL/SQL stored procedures and triggers to populate the base tables on a daily basis and to implement business rules and transformations.
  • Involved in optimizing and performance-tuning logic on targets, sources, mappings, and sessions to increase session efficiency, and scheduled workflows using AutoSys.
  • Implemented complex mappings such as Slowly Changing Dimensions (Type 2) using a flag.
  • Designed and developed UNIX shell scripts as part of the ETL process to compare control totals and to automate loading, pulling, and pushing data between servers (see the sketch after this list).
  • Defined Business Objects classes and objects, and created complex universes and reports.
  • Created reports using Business Objects functionality such as queries, slice and dice, drill down, master/detail, and formulas.
  • Written SQL overrides in source qualifier according to the business requirements.
  • Extracted data from the TrueComp repository, which is a normalized schema, and loaded it into the TrueComp data mart, which is a starflake schema; Actuate reports pull their data from this data mart schema.
  • Performance-tuned Business Objects reports with long execution times.
  • Performed extensive testing: wrote SQL queries to verify data loads, performed unit testing at various levels of the ETL, and developed and implemented Informatica mapping code.
  • Performed database checks and tuned the databases using Teradata Manager.
  • Involved in troubleshooting the loading failure cases, including database problems.
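
A minimal sketch of the control-total comparison described above, assuming a line-per-record source file; the file path, TDPID, credentials, and table names are hypothetical placeholders.

#!/bin/ksh
# Minimal sketch: compare the source file record count with the target
# table row count after a load. Paths, TDPID, credentials, and table
# names are hypothetical placeholders.
SRC_FILE=/data/in/policies.dat
SRC_CNT=$(wc -l < "$SRC_FILE")

# Run the count in BTEQ and keep only the digits-only result line.
TGT_CNT=$(bteq <<'EOF' | awk '/^ *[0-9]+ *$/ {print $1}'
.LOGON tdprod/etl_user,etl_pwd;
.SET TITLEDASHES OFF;
SELECT COUNT(*) FROM edw_stg.policies_stg;
.LOGOFF;
EOF
)

if [ "$SRC_CNT" -ne "$TGT_CNT" ]; then
    echo "Control totals mismatch: source=$SRC_CNT target=$TGT_CNT" >&2
    exit 1
fi
echo "Control totals match: $SRC_CNT rows"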

Environment: Informatica Power Center 8.6.1, XML files, Erwin 4.0/3.5, Oracle 11g, DB2, Teradata, PL/SQL, Business Objects, TOAD, UNIX AIX.

Confidential - Dallas, TX

Informatica Developer

Responsibilities:

  • Worked with business analysts to analyze the functional requirements, parsing the high-level design spec into simple ETL coding and mapping standards.
  • Designed complex mappings involving target load order and constraint-based loading.
  • Uploaded data from the operational source system (Oracle 9i) to Teradata.
  • Worked with Teradata utilities such as BTEQ, FastLoad, MultiLoad, TPump, and Teradata Parallel Transporter.
  • Wrote UNIX shell scripts to work with flat files, create pre- and post-session commands, and schedule workflows.
  • Worked with DBAs and Data Architects in planning and implementing appropriate data partitioning strategy for Enterprise Data Warehouse.
  • Extracted and loaded data using Power Exchange for Netezza tables.
  • Developed stored procedures, database triggers and SQL queries where needed.
  • Designed the ETL processes using Informatica to load data from DB2, Oracle, SQL Server, Flat Files, XML Files and Excel files to target Oracle Data Warehouse database.
  • Extensively used the XML Source Qualifier, midstream XML Parser, and midstream XML Generator transformations, as our main source type was XML files.
  • Extensively used Dynamic Lookup transformation and Update Strategy transformation to update slowly changing dimensions.
  • Created a data mart and extracted data from Oracle using Informatica Power Center.
  • Loaded the load-ready file into a Teradata table using FastLoad, MultiLoad, BTEQ, and TPump connections; the load date and load time are also captured in the Teradata table.
  • Architected and developed FastLoad and MLoad control-file scripts, and developed BTEQ scripts to process the data on the staging server (see the sketch after this list).
  • Written SQL overrides in source qualifier according to the business requirements.
  • Used Workflow Manager to create, validate, test, and run sequential and concurrent batches and sessions, scheduling them to run at specified times with the required frequency.
  • Created and used different tasks like Decision, Event Wait, Event Raise, Timer and E-mail etc.
  • Used Metadata Manager to trace relationships and load metadata from different tools such as Informatica, DB2, and Oracle.
  • Responsible for Error Handling and bug fixing.
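
A minimal sketch of a FastLoad control script of the kind described above; the TDPID, credentials, tables, and file layout are hypothetical placeholders (FastLoad requires an empty target, so a staging table is assumed).

#!/bin/ksh
# Minimal sketch: bulk-load a pipe-delimited extract into an empty Teradata
# staging table with FastLoad. TDPID, credentials, tables, and the layout
# are hypothetical placeholders.
fastload <<'EOF'
.LOGON tdprod/etl_user,etl_pwd;

SET RECORD VARTEXT "|";

DEFINE order_id  (VARCHAR(18)),
       cust_id   (VARCHAR(18)),
       order_amt (VARCHAR(18))
       FILE = /data/in/orders.dat;

BEGIN LOADING edw_stg.orders_stg
    ERRORFILES edw_wrk.orders_e1, edw_wrk.orders_e2;

INSERT INTO edw_stg.orders_stg (order_id, cust_id, order_amt)
VALUES (:order_id, :cust_id, :order_amt);

END LOADING;
.LOGOFF;
EOF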

Environment: Power Center 8.6.1, Power Exchange 8.6.1, Metadata Manager 8.1.1, Teradata 13.0, Oracle 11g, Mainframe DB2, XML, Netezza.

Confidential - Cincinnati, OH

Informatica Developer

Responsibilities:

  • Studied the existing business model and requirements; involved in requirements gathering and understanding business needs.
  • Responsible for code builds and unit testing, and coordinated with the QA team during QA and UAT testing.
  • Used several transformations like Source Qualifier, Expression, Lookup, Aggregator, Joiner, Sequence Generator, Filter, Router, Java, SQL, Stored Procedure and Update Strategy transformations for complex mappings.
  • Created Informatica workflows, worklets, and reusable sessions.
  • Parsed the high-level design spec into simple ETL coding and mapping standards.
  • Developed mappings with PDO Optimization to leverage scalable Teradata Architecture.
  • Extensively used Teradata SQL Assistant for SQL queries, collecting statistics, query band testing, etc. (see the sketch after this list).
  • Developed & configured Control-M job groups and jobs for processing various Informatica workflows as per business process flow.
  • Actively participated in the code migration process to QA, UAT and PROD.
  • Participated in exporting sources, targets, mappings, workflows, tasks, etc., and importing them into the new Informatica 8.x, then tested and reviewed to make sure all the workflows execute as per the design documents and tech specs.
  • Extensively used XML source, XML target, and XML Parser transformations.
  • Developed mapplets and transformations for the migration of data from existing systems to the new system using Informatica Designer.
  • Developed stored procedures and functions in PL/SQL using Oracle SQL Developer.
  • Created complex Mappings which involved Slowly Changing Dimensions to capture the historical records in the source systems.
  • Used Workflow Manager to create, validate, test, and run batches and sessions, scheduling them to run at specified times.
  • Proactively interacted with clients to understand requirements and fix issues, and worked with the offshore team and the database team on any issues.
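
A minimal sketch of the query-band and statistics-collection statements exercised above, shown here through BTEQ rather than SQL Assistant; the TDPID, credentials, and table/column names are hypothetical placeholders.

#!/bin/ksh
# Minimal sketch: tag the session with a query band, then refresh optimizer
# statistics. TDPID, credentials, and table/column names are hypothetical.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pwd;

/* Tag the session so the workload can be traced in DBQL */
SET QUERY_BAND = 'App=ClaimsETL;Job=wf_claims_daily;' FOR SESSION;

/* Refresh optimizer statistics on frequently joined/filtered columns */
COLLECT STATISTICS ON edw.claims COLUMN (claim_id);
COLLECT STATISTICS ON edw.claims COLUMN (member_id);

.LOGOFF;
EOF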

Environment: Power Center 8.1, Power Exchange 8.1.1, Teradata 13.0, Oracle 10g/9i, XML, Netezza, and PL/SQL.

Confidential

Informatica Developer

Responsibilities:

  • Analyzed source system data and business requirements, applied business rules for building the data warehouse, and prepared technical documentation of existing and proposed models.
  • Created the necessary repositories to handle the metadata in the ETL process.
  • Created the Source and Target Definitions in Informatica Power Center Designer, created mappings, sessions and workflows for populating the data into the dimension and fact tables from different source systems.
  • Extracted data from various source IT systems such as Oracle, SQL Server, and files (products, customers, rates, actuals, etc.).
  • Created mapplets for loading a reusable control table, and configured tasks and workflows using Workflow Manager.
  • Ran SQL scripts from TOAD and created Oracle objects such as tables, views, indexes, sequences, and users; wrote SQL code for problem resolution and performance optimization.
  • Automated and scheduled UNIX shell scripts and Informatica sessions and batches using AutoSys.
  • Responsible for error handling using session logs and reject files in the Workflow Monitor.
  • Used mapping parameters and variables to pass values between sessions (see the sketch after this list).
  • Used Lookup, Aggregator, Joiner, Union, Expression, Filter, Update Strategy, SQL, Normalizer, Router, and Source Qualifier transformations to model various standardized business processes and implement business logic in Informatica.
  • Worked with different Workflow Manager tasks such as Session, Event-Raise, Event-Wait, Decision, Email, Command, Assignment, and Timer, along with worklets and workflow scheduling.
  • Used Star Schema approach for designing of Data Warehouse using Erwin.
  • Worked with Memory cache for improving throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
  • As per the requirement, performed Initial loading, monthly loading and monthly cleanup update processing.
  • Used SQL Override in Source qualifier to customize SQL and filter data according to requirement.
  • Responsible for performance tuning at the mapping, session, source, and target levels for Slowly Changing Dimension Type 1, Type 2, and Type 3 data loads.
  • Prepared Unit Test Plan (UTP) documents with all the test cases for the developed mappings, plus unit testing, integration testing, and system testing.
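
A minimal sketch of generating an Informatica parameter file at run time so parameter and variable values can be passed between sessions; the folder, workflow, session, connection, and parameter names are hypothetical placeholders.

#!/bin/ksh
# Minimal sketch: write an Informatica parameter file before a workflow run.
# Folder, workflow, session, connection, and parameter names are hypothetical.
LOAD_DT=$(date +%Y-%m-%d)
PARAM_FILE=/infa/params/wf_monthly_load.par

cat > "$PARAM_FILE" <<EOF
[DW_FOLDER.WF:wf_monthly_load.ST:s_m_load_fact]
\$\$LOAD_DATE=$LOAD_DT
\$DBConnection_Src=Ora_Src_Conn
\$DBConnection_Tgt=Ora_DW_Conn
EOF

echo "Parameter file written to $PARAM_FILE for load date $LOAD_DT"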

Environment: Oracle 10g, Informatica Power Center 7.1.3 (Designer, Repository Manager, Workflow Manager, Workflow Monitor), UNIX, TOAD, MS Visio.
