
Sr. Informatica Developer Resume


Birmingham, Alabama

SUMMARY

  • AWS-certified ETL developer with 6+ years of professional IT experience in the analysis, design, development and implementation of data warehouses using technologies such as Informatica Power Center 10.x/9.6/9.1/8.6, Oracle 10g/9i, MS SQL Server 2012, DB2, MySQL and UNIX shell scripting.
  • Strong experience using Informatica Power Center client tools (Repository Manager, Mapping Designer, Workflow Manager, Workflow Monitor, Metadata Manager), Power Exchange and Power Connect as ETL tools against Oracle, DB2 and SQL Server databases to develop ETL processes.
  • Proficient in using various Informatica transformations like Filter, Expression, Rank, Sorter, Router, Aggregator, Joiner, Lookup, Sequence Generator and Update strategy to design and develop mappings as per the business requirements.
  • Sound experience with the documentation process, including translating user requirements into system specifications and ETL source-to-target mapping documents.
  • Proficient in using the ERWIN tool for data modeling, including Star and Snowflake schema dimensional data marts.
  • Expertise in implementing complex business rules by creating reusable transformations, mappings & mapplets.
  • Sound expertise in writing, testing and implementing complex SQL and PL/SQL queries, including unions, multi-table joins, views, stored procedures, functions and triggers.
  • Experience in tuning mapping performance and in identifying and resolving bottlenecks at various levels: sources, targets, mappings and sessions.
  • Good experience across a wide range of data warehousing project areas, including system analysis, migration, deployment, quality assurance (QA) testing, user acceptance testing (UAT), production support, maintenance and change control.
  • Developed UNIX shell scripts for file validation and job scheduling across a wide range of UNIX flavors (a typical validation-and-load script is sketched after this summary).
  • Good command of parallel and partitioning techniques and of Teradata utilities such as FastLoad, MultiLoad, FastExport and BTEQ scripts for loading data into Teradata.
  • Experience in designing and creating Hive tables to load data into Hadoop, and in operations such as merging, sorting and joining tables.
  • Experience in detecting ongoing issues, applying bug fixes, monitoring Informatica sessions and tuning mappings.
  • Experience in all phases of the software development life cycle (SDLC) using Agile project methodology.
  • Experience gathering data from sources such as flat files, XML files and sequential files, as well as operational sources like Teradata, Oracle, SQL Server and IBM DB2, to load enterprise data into the warehouse.
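
As an illustration of the UNIX shell scripting noted above, the following is a minimal sketch that validates an incoming pipe-delimited flat file and then loads it into Teradata with BTEQ. The file path, column layout, table and logon values are illustrative assumptions, not details from any specific engagement.

#!/bin/ksh
# Validate an incoming pipe-delimited flat file, then load it into a
# Teradata stage table via BTEQ. All names below are placeholders.

SRC_FILE=/data/inbound/customer_feed.dat
EXPECTED_COLS=2

# File validation: the file must exist, be non-empty and have the expected column count.
if [ ! -s "$SRC_FILE" ]; then
    echo "ERROR: $SRC_FILE is missing or empty" >&2
    exit 1
fi

ACTUAL_COLS=$(head -1 "$SRC_FILE" | awk -F'|' '{print NF}')
if [ "$ACTUAL_COLS" -ne "$EXPECTED_COLS" ]; then
    echo "ERROR: expected $EXPECTED_COLS columns, found $ACTUAL_COLS" >&2
    exit 1
fi

# BTEQ import: read each delimited record and insert it into the stage table.
bteq <<EOF
.LOGON tdprod/etl_user,etl_password;
.IMPORT VARTEXT '|' FILE = $SRC_FILE;
.QUIET ON;
.REPEAT *
USING (cust_id VARCHAR(20), cust_name VARCHAR(100))
INSERT INTO stage_db.customer_stg (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
.LOGOFF;
.QUIT;
EOF
exit $?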

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 10.x/9.6/9.1/8.6, Informatica Power Exchange.

Operating Systems: IBM AIX 4.3 and Windows 10/XP/NT.

Languages: PL/SQL, C/C++, UNIX (AIX, HP-UX) Shell scripting, Java.

Databases: Teradata 13.4, Oracle 10g/9i/8i, Microsoft SQL Server 2012, UDB/DB2 on AIX.

Data Modeling: Dimensional Data Modeling, Star and Snow Flake Schemas, ERWIN.

Other Skills: Cyber fusion, SQL Assistant, Pattern Action Language (PAL), Perl scripting, Control-M 8.0.0, Autosys, SAP MDM 5.5/7.1, Microsoft Visio, Management console, Troubleshooting issues, Data Issue resolution, Batch monitoring.

PROFESSIONAL EXPERIENCE

Sr. Informatica Developer

Confidential - Birmingham, Alabama

Responsibilities:

  • Designed, developed and implemented complex logic in mappings using transformations such as Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router and Stored Procedure.
  • Worked on initial, incremental and daily loads to ensure that data is loaded into the target tables in the expected format.
  • Helped move team processes from manual steps to automation, and gained experience with data profiling and data modeling.
  • Developed reusable SQL code to avoid maintaining separate code for each source system.
  • Prepared technical specifications for data inputs to guide and accelerate development.
  • Extracted data from various sources like Teradata, Mainframe files, Flat Files and loaded to target databases and files.
  • Created a process to pull data from existing applications and land it on Hadoop (a sketch of this kind of landing step follows this list).
  • Developed logical and physical data models that capture current-state and future-state data elements and data flows using Erwin.
  • Developed mapplets, reusable transformations, source and target definitions, mappings using Informatica.
  • Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica sessions, batches and target data.
  • Responsible for upgrading Informatica 9.1 to 9.5 and validating objects in new versions of Informatica.
  • Identified, prioritized and resolved data issues, and designed and developed transformation logic in coordination with the clients.
  • Organized job creation and developed automated email notifications to alert the operations team to load issues such as job failures and rejected or dropped rows.
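
As a hedged illustration of the Hadoop landing process mentioned in this list, the sketch below copies an extracted flat file to HDFS and exposes it through a Hive external table. The paths, database, table and column names are assumptions for illustration only.

#!/bin/ksh
# Land an extracted flat file on Hadoop and expose it as a Hive external table.
# All paths, database, table and column names are placeholders.

EXTRACT_FILE=/data/outbound/policy_extract.dat
HDFS_DIR=/landing/policy

# Copy the extract into the HDFS landing directory.
hdfs dfs -mkdir -p "$HDFS_DIR"
hdfs dfs -put -f "$EXTRACT_FILE" "$HDFS_DIR/" || exit 1

# Define a Hive external table over the landing directory (created once).
hive -e "
CREATE DATABASE IF NOT EXISTS landing;
CREATE EXTERNAL TABLE IF NOT EXISTS landing.policy_raw (
    policy_id   STRING,
    policy_type STRING,
    premium_amt DECIMAL(12,2)
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION '$HDFS_DIR';
"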

Environment: Informatica Power Center 9.6.1/9.5.1, Teradata 13.4, DB2, Oracle 11g, AIX/UNIX, Hadoop, Flat files, Erwin, TOAD, SQL plus, SQL Assistant, WinSCP.

Informatica Developer

Confidential - Birmingham, Alabama

Responsibilities:

  • Worked on business intelligence projects involving the architecture of the ETL process, designing and implementing technical specifications as Informatica mappings to build the data warehouse.
  • Worked with data modelling and functional teams to identify and extract data from numerous sources.
  • Designed and wrote technical specifications alongside functional experts, strengthening analytical and development skills.
  • Performed DDL and DML SQL operations while designing logical and physical databases and data models.
  • Populated the data warehouse from different source systems such as the ODS and flat files, building the required transformation logic.
  • Analyzed and compared performance of Hadoop, MySQL and SQL Server databases and made recommendations to management.
  • Used Workflow Manager to create sessions and reusable worklets and to schedule workflows and sessions at the specified frequency (a command-line pmcmd equivalent is sketched after this list).
  • Focused on efficiency, reducing cost while extracting data from Oracle and flat files.
  • Provided engineering support, managing time across both onsite integration and the production launch.
  • Worked in a team with component developers to analyze and design the associated product.
  • Worked with the quality team on quality assurance, unit testing and integration testing to test the jobs and monitor the system process flow.
  • Handled scenarios such as clients changing requirements at crucial stages, where the technical specifications had to be revised.
  • Became acquainted with the functional and business aspects of the relevant components.
  • Automated processes that previously required manual intervention and documented the resulting changes.
  • Provided weekly and monthly batch support during production runs.
  • Proactively designed reusable transformations and mapplets to provide long-term benefit.
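
Workflows on this project were created and scheduled through Workflow Manager; purely as an illustrative sketch, the script below shows the command-line counterpart of starting and monitoring such a workflow with pmcmd. The service, domain, credential, folder and workflow names are assumptions.

#!/bin/ksh
# Start an Informatica workflow from the command line and report its status.
# Service, domain, credentials, folder and workflow names are placeholders.

INFA_USER=etl_user
INFA_PWD=etl_password
DOMAIN=Domain_DW
INT_SVC=IS_DW
FOLDER=FIN_DM
WORKFLOW=wf_load_fin_dm

# -wait blocks until the workflow completes, so the exit code reflects success or failure.
pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
    -u "$INFA_USER" -p "$INFA_PWD" \
    -f "$FOLDER" -wait "$WORKFLOW"

if [ $? -ne 0 ]; then
    echo "Workflow $WORKFLOW failed" | mailx -s "ETL load failure" ops_team@example.com
    exit 1
fi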

Environment: Windows 2008, Informatica Power Center 9.1/8.6, Teradata 13.4, Flat files, Oracle 11g, SQL Server 2008R2, SQL Loader, Hadoop and AIX/UNIX shell scripts.

Sr. ETL Developer

Confidential - Detroit, MI

Responsibilities:

  • Developed a strong understanding of the overall process, including goals, scheduling, responsibilities, budget timelines and resources, by engaging the broader business teams.
  • Enhanced the existing process and produced documentation including process flow diagrams, workflows and design strategies.
  • Gathered requirements for different solicitations and business processes, wrote the corresponding test cases and scripts, and created use cases for future development.
  • Developed UNIX scripts to convert semi-structured flat files into a structured format that is then consumed by Informatica (a sketch of such a conversion follows this list).
  • Strong exposure to writing stored procedures in Oracle 11g for data conversion and loading.
  • Developed mappings, workflows and schedules in Informatica to support the development effort.
  • Wrote queries to improve SQL Server performance on SQL Server 2000/2005/2008/2012.
  • Developed complex T-SQL programs, triggers and queries, and wrote stored procedures with efficient execution plans.
  • Worked through all SDLC phases of extract, transform and load (ETL) processes using Informatica Power Center and SQL Server T-SQL stored procedures within a SQL Server 2012 environment.
  • Used Derived Column, Conditional Split, Data Conversion, OLE DB Command, Term Extraction, Aggregate and Pivot transformations, plus Execute SQL and Script Component tasks, to load data from XML files in SSIS packages.
  • Loaded data from heterogeneous source systems, multi-format flat files, Excel and XML files into the SQL Server database through reusable SSIS packages.
  • Worked extensively on ETL and data integration, developing ETL mappings and scripts using Informatica.
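
As an illustration of the UNIX conversion scripts described in this list, the following is a minimal sketch that turns a semi-structured name=value flat file into a pipe-delimited layout that an Informatica flat-file source definition could read. The input layout, field names and paths are assumptions.

#!/bin/ksh
# Convert a semi-structured "name=value" flat file into a pipe-delimited
# layout for an Informatica flat-file source. Field names are placeholders.
#
# Example input (one attribute per line, blank line between records):
#   id=1001
#   name=ACME Corp
#   amount=250.75

IN_FILE=/data/inbound/vendor_semi.txt
OUT_FILE=/data/staging/vendor_structured.dat

awk -F'=' '
    /^[[:space:]]*$/ {                  # a blank line ends the current record
        if ("id" in rec)
            print rec["id"] "|" rec["name"] "|" rec["amount"]
        split("", rec)                  # portable way to clear the array
        next
    }
    { rec[$1] = $2 }                    # collect name=value pairs
    END {                               # flush the final record, if any
        if ("id" in rec)
            print rec["id"] "|" rec["name"] "|" rec["amount"]
    }
' "$IN_FILE" > "$OUT_FILE"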

Environment: Informatica Power Center 8.1/7.1, Teradata, Oracle 9i, MS Access, Linux, Windows 2003, SQL Server 2000, PL/SQL, Shell Scripts and Hadoop.

ETL Developer

Confidential 

Responsibilities:

  • Used sources such as Teradata, DB2, Oracle, mainframes and flat files to acquire the relevant data needed to populate the target files.
  • Designed jobs using a wide range of stages, including Join, Merge, Lookup, Filter, Dataset, Remove Duplicates, Change Data Capture (CDC), connectors, ODBC, Switch, Modify and Aggregator.
  • Reworked cumbersome ETL jobs using Informatica tools, resulting in more efficient interfaces between source and target systems.
  • Used Informatica Workflow Manager to create, schedule, execute and monitor sessions, worklets and workflows.
  • Wrote stored procedures and shell scripts to automate job execution in pre- and post-session commands, modify parameter files and prepare data sources (a pre-session sketch follows this list). Designed target schema definitions and extraction, transformation and loading (ETL) mappings according to business rules.
  • Good experience in root-cause analysis for data capture, data cleansing and data movement procedures.
  • Built mappings that act as a bridge connecting the source systems to the target system.
  • Organized the schedule properly, tested and debugged the required components, and monitored the executable versions.
  • Completed data loading procedures and file manipulation tasks in an automated way using shell scripts.
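
As a hedged sketch of the pre-session shell scripting referenced above, the script below regenerates an Informatica parameter file with the current run date before the session starts. The folder, workflow, session, parameter and path names are assumptions.

#!/bin/ksh
# Pre-session script: rebuild the workflow parameter file with the current
# run date so the session picks up the correct source file and load window.
# Folder, workflow, session and parameter names below are placeholders.

PARAM_FILE=/infa/param/wf_daily_load.par
RUN_DATE=$(date +%Y-%m-%d)

cat > "$PARAM_FILE" <<EOF
[FIN_DM.WF:wf_daily_load.ST:s_m_load_txn]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_FILE=/data/inbound/txn_$RUN_DATE.dat
EOF

echo "Parameter file $PARAM_FILE refreshed for run date $RUN_DATE"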

Environment: Informatica Power Center, Oracle 9i, MS Access, Linux, Windows 2003, SQL Server, PL/SQL, Shell Scripts.
