
ETL/Data Warehouse Developer Resume


Nashville, TN

SUMMARY

  • Over 5 years of IT industry experience encompassing a wide range of skill sets.
  • Executed software projects for the banking and insurance industries.
  • Working experience with Informatica PowerCenter 10.1.0 and DataStage 11.x/8.x/7.x.
  • Academic experience implementing various Big Data regression and classification methods using PySpark.
  • Understanding of regression methods such as linear, ridge, LASSO, random forest, decision tree, and gradient boosting.
  • Understanding of cloud computing concepts and architecture: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS).
  • Understanding of algorithms and applications for mining text/web data, including entity extraction, social graph analysis, text clustering, TF-IDF indexing, natural language processing, trend analysis, and the semantic web.
  • Involved in ETL design, development, production enhancements, support, and maintenance.
  • Used R programming to generate graphs and analyze test data as part of academic projects; good knowledge of statistics.
  • Strong understanding of Data Warehousing principles, including fact tables, dimension tables, and schema models such as star and snowflake schemas.
  • Worked extensively with Dimensional modeling, Data Migration, Data Cleansing, ETL Processes.
  • Developed jobs for extracting, cleansing, transforming, integrating, and loading data into the Data Warehouse database.
  • Developed DataStage parallel jobs using different processing stages such as Transformer, Aggregator, Lookup, Join, Sort, Copy, Merge, Funnel, and Filter.
  • Experienced in scheduling Sequence and parallel jobs using DataStage Director, UNIX scripts and scheduling tools.
  • Worked with Informatica PowerCenter tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Involved in Logical & Physical Database Layout Design.
  • Worked extensively on different types of transformations such as Source Qualifier, Expression, Filter, Aggregator, Rank, Update Strategy, Lookup, Sequence Generator, Joiner, and XML.
  • Proficient in developing strategies for Extraction, Transformation, and Loading (ETL) mechanisms.
  • Created PL/SQL stored procedures, packages, and functions for moving data from the staging area to the warehouse database (a minimal sketch appears after this list).
  • Conducted unit tests using various test cases.
  • Created and modified several UNIX Korn Shell scripts according to the changing needs of the project and client requirements.
  • Worked with scheduling tools Autosys and Active Batch.
  • Interacted with clients to understand the business requirements and review the design documents.
  • Experience in troubleshooting jobs and addressing production issues such as data issues, environment issues, performance tuning, and enhancements.
  • Provided training and direction to team members.
  • Experience with software configuration management tools such as Rational ClearCase and Git for version control.
  • Effective in cross-functional and global environments, managing multiple tasks and assignments concurrently with strong communication skills.
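
The PL/SQL staging-to-warehouse procedures mentioned in the list above can be illustrated with a minimal sketch. The table and column names below (stg_customer, dim_customer) are hypothetical, not objects from the projects described in this resume, and the logic assumes a simple Type 1 overwrite of changed attributes.

  -- Minimal sketch with hypothetical table names (stg_customer, dim_customer).
  CREATE OR REPLACE PROCEDURE load_dim_customer AS
  BEGIN
    -- Upsert staged rows into the dimension (Type 1: overwrite changed attributes).
    MERGE INTO dim_customer d
    USING stg_customer s
      ON (d.customer_id = s.customer_id)
    WHEN MATCHED THEN
      UPDATE SET d.customer_name = s.customer_name,
                 d.city          = s.city,
                 d.updated_dt    = SYSDATE
    WHEN NOT MATCHED THEN
      INSERT (customer_id, customer_name, city, updated_dt)
      VALUES (s.customer_id, s.customer_name, s.city, SYSDATE);
    COMMIT;
  EXCEPTION
    WHEN OTHERS THEN
      ROLLBACK;
      RAISE;
  END load_dim_customer;
  /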

TECHNICAL SKILLS

Database and Programming Languages: SQL, Teradata SQL, PL/SQL, UNIX shell scripting

Operating Systems: Windows 10, UNIX, Linux

Databases and Client Tools: Oracle, DB2, Exadata, Netezza, Teradata, Microsoft SQL Server Management Studio, TOAD

ETL Tools: Informatica PowerCenter 10.1.0, IBM WebSphere DataStage 11.x, Ascential DataStage 7.x

Scheduling tools: Autosys, Active Batch Console

Other tools: HP Quality Center, Rational ClearCase, RStudio, WinSCP, PuTTY

PROFESSIONAL EXPERIENCE

Confidential, Nashville, TN

ETL/ Data Warehouse Developer

Responsibilities:

  • Involved in the complete SDLC, including requirements specification, analysis, design, and development.
  • Created ETL mappings in Informatica PowerCenter to extract data from multiple sources such as flat files, Oracle, XML, CSV, and delimited files, transform it based on business requirements, and load it into the Data Warehouse.
  • Created, launched, and scheduled sessions.
  • Created XML files for Export and Import between various ETL environments.
  • Worked on PL/SQL packages and stored procedures to implement business rules and validations.
  • Performed performance tuning at the source and target levels using indexes, hints, and partitioning in Oracle and Informatica.
  • Developed mappings using mapping designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Sorter and Sequence generator Transformations.
  • Dealt with complex mappings involving slowly changing dimensions, implementation of business logic, and capturing records deleted in the source system (see the sketch after this list).
  • Involved in troubleshooting of production batch jobs.
  • Created sessions, batches for incremental load into staging tables and scheduled them to run daily.
  • Worked on complex Source Qualifier queries and pre- and post-SQL queries on the target.
  • Set and followed Informatica best practices, such as creating shared objects in shared folders for reusability and using standard naming conventions for ETL objects; designed complex Informatica transformations, mapplets, mappings, reusable sessions, worklets, and workflows.
  • Involved in the development of mappings, mapplets, sessions, workflows, and worklets.
  • Assisted other ETL developers in solving complex scenarios and coordinated with source system owners on day-to-day ETL progress monitoring.
  • Performed unit testing, data validation, and testing of jobs using the debugger to verify data flow and fix bugs.
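
The delete-capture step mentioned above can be sketched as a soft-delete pass against the warehouse table: any active target row whose business key no longer appears in the latest full source extract is flagged as deleted. The table and column names (customer_dim, stg_customer_extract, delete_flag) are hypothetical, not objects from this project.

  -- Minimal sketch with hypothetical names; assumes stg_customer_extract holds the latest full extract.
  UPDATE customer_dim
  SET    delete_flag = 'Y',
         deleted_ts  = CURRENT_TIMESTAMP
  WHERE  delete_flag = 'N'
    AND  NOT EXISTS (SELECT 1
                     FROM   stg_customer_extract s
                     WHERE  s.customer_id = customer_dim.customer_id);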

Environment: Informatica PowerCenter 10.1.0, Teradata, Microsoft SQL Server, Active Batch, UNIX shell scripting

Confidential

ETL Developer

Responsibilities:

  • Involved in the design of the dimensional data model, including star schema and snowflake schema.
  • Collaborated with the Data Modeler on the low-level design document for mapping files from source to target and implementing the business logic.
  • Designed and developed Informatica jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data, and loaded it into databases.
  • Worked with SCDs to populate Type 1 and Type 2 slowly changing dimension tables from several operational source files (see the sketch after this list).
  • Debugged, tested, and fixed the logic applied in the transformations.
  • Migrated existing scripts from database servers to the ETL server, and migrated the Autosys jobs that trigger these scripts from the R4.5 server to the R11 server.
  • Loaded various kinds of input files into the Netezza database after proper validation.
  • Migrated jobs from DataStage to Informatica Power Center as part of ETL conversion project.
  • Analyzed the existing jobs in DataStage and developed Informatica mappings according to industry standards.
  • Extensively used Expression, Source Qualifier, Joiner, and Lookup transformations, along with the Debugger, for development and debugging.
  • Extensively used almost all Informatica transformations, including complex Lookups, Stored Procedure, Update Strategy, and others.
  • Involved in creating UNIX shell scripts for database connectivity and for executing queries during parallel job execution.
  • Documented ETL test plans, test cases and validations based on design specifications for unit testing.
  • Involved in unit testing and system testing to verify that data loaded into the target is accurate.
  • Prepared test data for testing, error handling and analysis.
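
The Type 2 pattern referenced above can be sketched in plain SQL: the current dimension row is expired when a tracked attribute changes, and a new current row is inserted for changed and brand-new keys (a Type 1 load would simply overwrite the changed attributes in place). The names used here (stg_account, dim_account, account_status as the tracked attribute) are hypothetical.

  -- Minimal Type 2 sketch with hypothetical names.
  -- Step 1: expire the current row when a tracked attribute changed.
  UPDATE dim_account d
  SET    current_flag = 'N',
         effective_end_dt = TRUNC(SYSDATE) - 1
  WHERE  d.current_flag = 'Y'
    AND  EXISTS (SELECT 1
                 FROM   stg_account s
                 WHERE  s.account_id = d.account_id
                 AND    s.account_status <> d.account_status);

  -- Step 2: insert a new current row for changed and brand-new accounts.
  INSERT INTO dim_account (account_id, account_status, effective_start_dt, effective_end_dt, current_flag)
  SELECT s.account_id, s.account_status, TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM   stg_account s
  WHERE  NOT EXISTS (SELECT 1
                     FROM   dim_account d
                     WHERE  d.account_id = s.account_id
                     AND    d.current_flag = 'Y');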

Environment: IBM WebSphere DataStage 11.0, Informatica PowerCenter 9.0, Oracle, Netezza, UNIX shell scripting, PL/SQL, WinSCP, PuTTY

Confidential

ETL Developer

Responsibilities:

  • Extracted data from sources such as fixed-width and delimited flat files, transformed the data according to the business requirements, and then loaded it into the Oracle database.
  • Experienced with file-processing stages including the Complex Flat File, Data Set, Lookup File Set, and Sequential File stages.
  • Developed slowly changing dimension mappings for Type 1 and Type 2 SCDs.
  • Experienced in developing parallel jobs using various Development stages.
  • Modified several of the existing mappings and created several new mappings based on the user requirement.
  • Maintained existing mappings by resolving performance issues.
  • Imported and created target definitions using the Warehouse Designer.
  • Wrote queries and stored procedures in PL/SQL to fetch data from the OLTP systems (a minimal sketch follows this list).
  • Created reusable transformations and used them in mappings.
  • Designed parallel jobs supporting Anti-Money Laundering (AML) efforts to detect income generated through illegal activity.
  • Interacted with multiple teams to ensure that data is available to downstream systems on time.
  • Fine-tuned Transformations and mappings for better performance.
  • Worked extensively on different types of transformations such as Source Qualifier, Expression, Filter, Aggregator, Rank, Lookup, Stored Procedure, Sequence Generator, and Joiner.
  • Developed shell scripts for automation of session loads.
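
The PL/SQL routines used to fetch data from the OLTP systems (noted above) can be sketched as a procedure that returns a ref cursor for downstream staging loads. The table and column names (ord_header, order_dt) are hypothetical.

  -- Minimal sketch with hypothetical OLTP table/column names.
  CREATE OR REPLACE PROCEDURE get_recent_orders (
      p_since  IN  DATE,
      p_orders OUT SYS_REFCURSOR
  ) AS
  BEGIN
    -- Hand back orders created since p_since for downstream staging loads.
    OPEN p_orders FOR
      SELECT order_id, customer_id, order_dt, order_amount
      FROM   ord_header
      WHERE  order_dt >= p_since;
  END get_recent_orders;
  /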

Environment: IBM WebSphere DataStage 11.0, Oracle, UNIX shell scripting, Autosys, PL/SQL, WinSCP, PuTTY

Confidential

IT Associate

Responsibilities:

  • Created and optimized diverse SQL queries; strong expertise in data processing and flowcharting techniques.
  • Implemented solutions using Oracle Forms, Reports, PL/SQL, and Workflow.
  • Understanding of database structures, principles, theories, and practices.
  • Monitored database performance and checked execution times.
  • Used TOAD, SQL Developer to develop and debug procedures and packages.
  • Ensured that the existing data warehouses, reporting systems and various tools were adequately supported.
  • Created a SQL module used to integrate existing third-party data into the database (a minimal sketch follows this list).
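
The third-party integration module noted above can be sketched as a validated insert of external rows that are not already present in the target table; the names (ext_vendor_feed, vendor_master) are hypothetical, and the statement is kept to plain SQL compatible with Oracle 8i.

  -- Minimal sketch with hypothetical names; skips rows with missing keys or existing matches.
  INSERT INTO vendor_master (vendor_id, vendor_name, load_dt)
  SELECT f.vendor_id, f.vendor_name, SYSDATE
  FROM   ext_vendor_feed f
  WHERE  f.vendor_id IS NOT NULL
    AND  NOT EXISTS (SELECT 1
                     FROM   vendor_master v
                     WHERE  v.vendor_id = f.vendor_id);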

Environment: Oracle 8i, PL/SQL, Forms 4.5, TOAD
