
Sr. Informatica Developer Resume


Tampa, FL

SUMMARY:

  • 8+ years of IT experience, specializing in building enterprise data warehouses, data marts, and operational data stores in industry segments such as Finance, Healthcare, Insurance, and Retail, using Informatica Power Center 9.6.1/9.5.1/8.6.1/8.1.1.
  • Experienced with the full software development life cycle (Planning, Analysis, Design, Deployment, Testing, Integration, and Support).
  • Expertise in the complete DWH life cycle: analysis, modeling review, ETL design, performance improvement, unit/system testing, and support.
  • Strong understanding of dimensional modeling, including Star and Snowflake schemas and the identification of facts and dimensions.
  • Experience working with different OLAP techniques such as MOLAP, HOLAP, and ROLAP.
  • Expertise working with the databases Oracle 10g/9i, Teradata 14/12, SQL Server 2000/2005/2008, DB2, MySQL, and Sybase.
  • Extensive experience creating stored procedures, functions, views, triggers, and complex SQL queries.
  • Solid experience with the Informatica and Teradata combination in an enterprise data warehouse environment.
  • Solid experience using Teradata utilities such as TPT, FastLoad, MultiLoad, and BTEQ scripts (see the BTEQ sketch after this list).
  • Developed Informatica mappings to load data from various sources into the data warehouse, using transformations such as Source Qualifier, Java, Expression, Lookup, Aggregator, Update Strategy, and Joiner.
  • Experience resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning mappings and sessions.
  • Extensive experience writing UNIX shell scripts and automating ETL processes with shell scripting.
  • Worked extensively with Change Data Capture (CDC) and Slowly Changing Dimensions (SCD).
  • Experienced with the Informatica Data Quality (IDQ) 8.6.1 toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 8.6.1.
  • Experience working with Informatica Cloud to extract data from cloud sources into Informatica Power Center.
  • Experience writing Perl scripts for data feed handling and implementing business logic.
  • Expertise in implementing performance tuning techniques at both the ETL and database levels.
  • Experience using automation and scheduling tools such as AutoSys, Control-M, and Maestro.
  • Experience working in both Waterfall and Agile methodologies.
  • Good communication skills, with a strong ability to interact with end users, clients, and team members.
  • Strong analytical and troubleshooting skills.
  • Self-motivated; enjoy working in a technically challenging environment.
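
A minimal BTEQ sketch of the kind of load step referenced above. The logon string and the stg.customer and dw.customer_dim tables are hypothetical placeholders, not objects from these projects:

  .LOGON tdprod/etl_user,etl_password;      -- placeholder TDPID and credentials
  .SET ERROROUT STDOUT;

  /* Insert only new customers from staging into the dimension */
  INSERT INTO dw.customer_dim (cust_id, cust_name, load_dt)
  SELECT s.cust_id, s.cust_name, CURRENT_DATE
  FROM stg.customer s
  WHERE NOT EXISTS (SELECT 1 FROM dw.customer_dim t
                    WHERE t.cust_id = s.cust_id);

  /* Abort with a non-zero return code if the insert failed */
  .IF ERRORCODE <> 0 THEN .QUIT 8;

  .LOGOFF;
  .QUIT 0;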

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 9.6.1/9.5.1/8.6.1/8.1.1, Informatica Cloud.

RDBMS: Oracle 10g/9i/8i, Teradata 14/12, DB2, SQL Server 2000/2005/2008, MySQL, Sybase

Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, Erwin, Microsoft Visio, HOLAP, MOLAP, ROLAP.

QA Tools: Quality Center

Operating System: Windows, Unix, Linux

Reporting Tools: Cognos, Business Objects

Languages: Java, XML, Perl Scripting, UNIX Shell Scripting, SQL, PL/SQL

PROFESSIONAL EXPERIENCE:

Confidential, Tampa, FL

Sr. Informatica Developer

Responsibilities:

  • Assist with management of application platforms from a security, capacity, reliability, and overall performance perspective
  • Maintain code efficiency and data integrity in all deliverables
  • Provide leadership in business requirement definition for new projects
  • Provide leadership in customer service to our business community for change requests and issue research
  • Define, test, implement, and govern technical standards for the overall EAI development framework.
  • Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant, and BTEQ.
  • Wrote BTEQ scripts to transform data and FastExport scripts to export data.
  • Extensively worked in the performance tuning of Teradata SQL, ETL and other processes to optimize session performance.
  • Involved in Initial loads, Incremental loads and Daily loads to ensure that the data is loaded in the tables in a timely and appropriate manner.
  • Used Informatica as the ETL tool, along with stored procedures, to pull data from source systems and files, then cleanse, transform, and load the data into Teradata using Teradata utilities.
  • Wrote Teradata macros and used various Teradata analytic functions (a macro sketch follows this list).
  • Loaded data into Teradata tables using the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport, and TPT.
  • Ensured compliance with the SDLC and SOX in all aspects of application development.
  • Analyzed the Business Requirement Documents (BRD) and laid out the steps for the data extraction, business logic implementation & loading into targets.
  • Responsible for Impact Analysis, upstream/downstream impacts.
  • Created detailed Technical specifications for Data Warehouse and ETL processes.
  • Received training and gained hands-on experience with Informatica Cloud, migrating data from Informatica Cloud to Informatica Power Center.
  • Responsible for implementing different OLAP techniques, such as MOLAP, ROLAP, and HOLAP, for analyzing the data.
  • Wrote Perl scripts, run as pre-session commands, to convert the end-of-record delimiter in a flat file source to an end-of-record marker followed by a newline character.
  • Worked with the Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Used most of the core transformations, such as Source Qualifier, Expression, Aggregator, Filter, Connected and Unconnected Lookup, Joiner, Update Strategy, and Stored Procedure.
  • Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirement.
  • Developed mappings to load fact and dimension tables, including SCD Type 1 and SCD Type 2 dimensions and incremental loading, and unit tested the mappings (see the SCD Type 2 sketch after this list).
  • Successfully upgraded Informatica from 9.1 to 9.5 and was responsible for validating objects in the new version.
  • Worked extensively with the Index, Data, and Lookup caches (Static, Dynamic, and Persistent) while developing the mappings.
  • Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
  • Integrated data into a centralized location using migration, redesign, and evaluation approaches.
  • Responsible for Unit Testing, Integration Testing and helped with User Acceptance Testing.
  • Scheduled Informatica jobs and implemented dependencies where necessary using AutoSys.
  • Managed post-production issues and delivered all assignments and projects within the specified timelines.
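
A small Teradata macro sketch of the kind mentioned above; the macro name, table, and parameter are hypothetical placeholders:

  CREATE MACRO dw.recent_orders (in_days INTEGER) AS (
    /* Return orders dated within the last :in_days days */
    SELECT order_id, cust_id, order_amt
    FROM dw.order_fact
    WHERE order_dt >= CURRENT_DATE - :in_days;
  );

  /* Usage */
  EXEC dw.recent_orders(7);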
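
A minimal SCD Type 2 sketch in Teradata SQL, assuming hypothetical stg.customer and dw.customer_dim tables with effective-date columns and address as the tracked attribute; in the project itself this logic was implemented in Informatica mappings:

  /* Step 1: expire the current dimension row when a tracked attribute changes */
  UPDATE dw.customer_dim t
  SET eff_end_dt = CURRENT_DATE - 1,
      current_flag = 'N'
  WHERE t.current_flag = 'Y'
    AND EXISTS (SELECT 1 FROM stg.customer s
                WHERE s.cust_id = t.cust_id
                  AND s.address <> t.address);

  /* Step 2: insert the new current version for each row expired in step 1 */
  INSERT INTO dw.customer_dim
    (cust_id, address, eff_start_dt, eff_end_dt, current_flag)
  SELECT s.cust_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
  FROM stg.customer s
  WHERE EXISTS (SELECT 1 FROM dw.customer_dim t
                WHERE t.cust_id = s.cust_id
                  AND t.eff_end_dt = CURRENT_DATE - 1
                  AND t.current_flag = 'N');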

Environment: Informatica Power Center 9.6.1/9.5.1, Informatica Cloud, Oracle 11g, DB2, Teradata, Flat Files, Erwin 4.1.2, SQL Assistant, TOAD, Perl, WinSCP, PuTTY, AutoSys, UNIX, Padre.

Confidential, Covington, KY

Sr. Informatica Developer

Responsibilities:

  • Involved in creating HLD and LLD documents based on the business and functional requirements (BR/FR), and had them reviewed by the customer along with detailed project timelines.
  • Involved in building the ETL architecture using Informatica 9.6.1/9.5.1 and source-to-target mappings to load data into the data warehouse.
  • Provided periodic updates to the customer on coding, unit testing, and releases, and acted as coordinator between the development and business teams.
  • Responsible for implementing different OLAP techniques, such as MOLAP, ROLAP, and HOLAP, for analyzing the data.
  • Used the Teradata Administrator and Teradata Manager tools to monitor and control the system.
  • Installed Teradata drivers for the Teradata utilities.
  • Refreshed data using the FastExport and FastLoad utilities.
  • Worked with MultiLoad and BTEQ; created and modified databases, performed capacity planning, allocated space, and granted rights on all objects within the databases.
  • Created and modified MultiLoad scripts for Informatica using UNIX, loading data into the IDW.
  • Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, FastLoad, and Informatica (a FastLoad sketch follows this list).
  • Performed Proof of Concept for Informatica Data Quality (IDQ).
  • Worked with various Teradata utilities, including BTEQ, MultiLoad, FastLoad, and FastExport.
  • Used Metadata Manager to validate, promote, import, and export repositories from the development environment to the testing environment.
  • Performed data quality analysis, gathered information to determine data sources, data targets, data definitions, data relationships, and documented business rules.
  • Worked on loading flat files into the data warehouse.
  • Created stored procedures, collections, and packages.
  • Created mapping documents to outline data flow from sources to targets. Parsed high-level design specification to simple ETL coding and mapping standards.
  • As the de facto release/configuration coordinator, coordinated configuration and release management efforts.
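
A skeletal FastLoad script of the kind used for these loads, assuming a hypothetical pipe-delimited input file and an empty staging table (FastLoad requires the target table to be empty); all names and paths are placeholders:

  SESSIONS 4;
  ERRLIMIT 50;
  LOGON tdprod/etl_user,etl_password;      /* placeholder credentials */

  /* FastLoad needs its error tables cleared from any prior run */
  DROP TABLE stg.customer_err1;
  DROP TABLE stg.customer_err2;

  SET RECORD VARTEXT "|";                  /* pipe-delimited input */
  DEFINE cust_id   (VARCHAR(10)),
         cust_name (VARCHAR(100)),
         address   (VARCHAR(200))
  FILE = /data/inbound/customer.dat;

  BEGIN LOADING stg.customer
        ERRORFILES stg.customer_err1, stg.customer_err2;

  INSERT INTO stg.customer (cust_id, cust_name, address)
  VALUES (:cust_id, :cust_name, :address);

  END LOADING;
  LOGOFF;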

Environment: Informatica 9.6.1/9.5.1, Informatica Data Quality (IDQ) 8.6.1, Teradata, DB2, Oracle, Flat Files, Maestro, UNIX, Padre, Perl, Windows, SQL Assistant.

Confidential, Plano, TX

Informatica Developer

Responsibilities:

  • Involved in all phases of the SDLC, from requirements and design through development, testing, and production support.
  • Extensively used Informatica Client tools like Informatica Repository Manager, Informatica Designer, Informatica Workflow Manager and Informatica Workflow Monitor.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Involved in migration projects moving data from Oracle and DB2 data warehouses to Teradata.
  • As a Teradata developer, responsible for maintaining all DBA functions (development, test, and production) in 24x7 operation.
  • Created Sources, Targets in shared folder and developed re-usable transformations, mapplets and user defined function (UDF) to re-use these objects in mappings to save the development time.
  • Developed mappings using Mapping Designer and worked with Aggregator, Lookup (connected and unconnected), Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
  • Created mappings involving Slowly Changing Dimensions Type 1 and Type 2 to implement business logic and capture records deleted in the source systems (see the SCD Type 1 sketch after this list).
  • Worked with high-volume datasets from various sources, such as Oracle, text files, and Teradata tables, as well as XML targets.
  • Used the Debugger extensively to identify bottlenecks in the mappings.
  • Modified PL/SQL stored procedures for Informatica mappings.
  • Created sessions and workflows to load data from SQL Server, flat file, and Oracle sources residing on servers located across the country.
  • Configured session properties, for example a high commit interval, to increase performance.
  • Involved in unit testing, Integration testing and User acceptance testing of the mappings.
  • Involved in migrating Informatica objects from the Dev to the QA repository using SVN on UNIX.
  • Developed workflows and sessions and monitored them to ensure data was properly loaded into the target tables.
  • Responsible for scheduling workflows, error checking, production support, maintenance and testing of ETL procedures using Informatica session logs.
  • Performed performance tuning on sources, targets, and mappings, along with SQL optimization.
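
A minimal SCD Type 1 sketch (overwrite in place, no history kept) as a Teradata MERGE, assuming hypothetical staging and dimension tables with cust_id as the dimension's primary index; in the project this logic lived in Informatica Update Strategy mappings:

  MERGE INTO dw.customer_dim AS t
  USING stg.customer AS s
    ON t.cust_id = s.cust_id
  WHEN MATCHED THEN UPDATE
    SET cust_name = s.cust_name,
        address   = s.address          /* Type 1: overwrite, no history */
  WHEN NOT MATCHED THEN INSERT
    (cust_id, cust_name, address)
    VALUES (s.cust_id, s.cust_name, s.address);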

Environment: Informatica Power Center 8.6.1, Flat Files, Oracle 11g, Teradata 12/13, SQL, PL/SQL, TOAD, SQL Assistant, Windows XP, UNIX, Perl, Maestro.

Confidential

ETL Developer

Responsibilities:

  • Design and develop ETL mappings and workflows using Informatica
  • Analyze, design, build, test and document complex data conversion and integration solutions
  • Design and develop Extract, Transform, and Load (ETL) code to migrate and integrate data from disparate data sources into Hadoop for specific projects
  • Work with business groups to develop and implement business rules to be used for ETL processes
  • Manage team code standards and the integrity of the code base; enforce proper source control and the reusability of extract and load code, as well as the standard transformations that apply to all systems
  • Establish policies and best practices for optimizing ETL data throughput and accessibility; create and maintain business intelligence and data migration design principles using industry best practices

Environment: Informatica Power Center 7.1.1/8.1.1, Oracle, Reports 25, SQL, PL/SQL, Windows NT, Erwin, UNIX
