
Sr. Application Developer Resume

Plano, TX

SUMMARY

  • About 14+ years of experience in Information Technology, ETL, BI, database, data warehouse development, and data modeling, along with 2+ years of experience in Big Data ecosystem technologies.
  • Designed and developed complex mappings such as SCD Type 1 and Type 2 applications using Informatica PowerCenter Designer.
  • Experienced in troubleshooting Informatica-related components.
  • Good knowledge on Talend ETL Architecture.
  • Experienced in integrating various data sources with multiple relational databases such as Oracle, Teradata, and SQL Server, and in integrating data from fixed-width and delimited flat files.
  • Extensive experience developing ETL processes to support data extraction, transformation, and loading using Informatica PowerCenter.
  • Good experience with data warehouse concepts and principles (Kimball/Inmon): star schema, snowflake, SCDs, surrogate keys, and normalization/denormalization.
  • Strong in developing conceptual, logical, and physical data models, as well as dimensional models using star schemas for data warehousing projects.
  • Hands on experience in application development using RDBMS, SQL and Linux shell scripting.
  • Wrote shell scripts for Informatica pre- and post-session operations.
  • Extensively used Informatica Repository Manager and Workflow Monitor.
  • Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Strong administrative working knowledge of Informatica Data Quality and related sub-products.
  • Knowledge of Informatica products: EDC (Enterprise Data Catalog), MDM, and Data Lineage.
  • Hands on experience in Performance Tuning of sources, targets, transformations and sessions.
  • Good experience in documenting the ETL process flow for better maintenance and analyzing the process flow.
  • Experience using Informatica command-line utilities such as pmcmd to execute workflows in non-Windows environments.
  • Performed Informatica platform upgrades and implemented new features of the toolsets.
  • Experience in installation/configuration of Informatica.
  • Informatica administration, including creating users, connections, and folders, and managing privileges.
  • Worked with Business Objects and Data Services.
  • Worked with Oracle Stored Procedures, Triggers, Cursors, Indexes and Functions.
  • Strong Knowledge in optimizing database performance.
  • Highly motivated to take independent responsibility, with the ability to contribute as a productive team member.
  • Excellent understanding/knowledge of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce programming paradigm.
  • Experience in analyzing data using HiveQL, good knowledge in Pig Latin and custom MapReduce programs in Java.
  • Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems and vice-versa.
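
The SCD Type 1/Type 2 mappings mentioned above follow an expire-and-insert pattern; a minimal sketch of the Type 2 logic in plain Python (all table and column names here are hypothetical illustrations, not the actual mappings):

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Apply SCD Type 2 changes: expire the current row for a changed
    business key and insert a new current row with a fresh surrogate key.

    dimension: list of dicts with keys sk, cust_id, city, eff_date, end_date, current
    incoming:  list of dicts with keys cust_id, city (hypothetical columns)
    """
    today = today or date.today()
    next_sk = max((r["sk"] for r in dimension), default=0) + 1
    for row in incoming:
        current = next((r for r in dimension
                        if r["cust_id"] == row["cust_id"] and r["current"]), None)
        if current and current["city"] == row["city"]:
            continue  # no change detected: nothing to do
        if current:
            # Type 2: expire the old version instead of overwriting it (Type 1)
            current["current"] = False
            current["end_date"] = today
        dimension.append({"sk": next_sk, "cust_id": row["cust_id"],
                          "city": row["city"], "eff_date": today,
                          "end_date": None, "current": True})
        next_sk += 1
    return dimension
```

In an Informatica mapping the same branching is typically done with a Lookup on the dimension plus a Router feeding separate update and insert targets; the sketch just makes the history-preserving logic explicit.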

TECHNICAL SKILLS

ETL Tools: Informatica 10.x/9.x/8.x (PowerCenter/PowerMart)

Hadoop /Big Data: HDFS, MapReduce, Hive, Pig, HBase, Sqoop, Flume, Oozie

Data Modeling: Erwin 4.0/3.5, Star Schema Modeling, Snowflake Modeling.

Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2005/2000, DB2, Teradata V2R6/V2R5

SAP Tools: ECC 6.0, 4.X, BW3.X

OLAP Tools: Business Objects 6.5/XI R1/R2, OBIEE

Programming Languages: C, C++, Java, ASP, C#, and .NET 3.5

Languages: SQL, PL/SQL, Unix Shell Script, Visual Basic

Tools: Toad, SQL* Loader

Operating Systems: Windows 2003/2000/NT, AIX, Sun Solaris, Linux

PROFESSIONAL EXPERIENCE

Confidential

Sr. Application Developer

Responsibilities:

  • Work in a fast-paced agile development environment to quickly analyze, develop, and test potential use cases for the business.
  • Used Informatica PowerExchange to register tables and imported source tables for PWX sources such as AWS Redshift (Enterprise power adapter), Salesforce, Oracle, and DB2 in Informatica PowerCenter Designer.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Modified existing mappings for enhancements of new business requirements.
  • Ingested data from the S3 raw bucket and loaded it into the Impala ODS and Hive (Big Data).
  • Refined data ingested from the S3 raw bucket, loaded it into the S3 refine bucket as slices, and loaded it into the Redshift database using the COPY command.
  • Debugged and resolved load failures by verifying the log files. Supported QA Testing in fixing the bugs and also helped to resolve various data issues.
  • Involved in writing the UNIX Shell Scripts, which triggers the workflows to run in a particular order as a part of the daily loading into the Warehouse.
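
The S3-to-Redshift slice load described above typically relies on Redshift's COPY command; a minimal sketch of assembling such a statement (the bucket, table, and IAM role names are hypothetical examples):

```python
def build_copy_sql(table, s3_uri, iam_role, fmt="JSON 'auto'"):
    """Build a Redshift COPY statement that bulk-loads files from S3.

    Redshift parallelizes the load across the files under the prefix,
    which is why refined data is written out as multiple slice files.
    """
    return (f"COPY {table} FROM '{s3_uri}' "
            f"IAM_ROLE '{iam_role}' {fmt};")

# Hypothetical example: load refined customer slices into an ODS schema.
sql = build_copy_sql("ods.customer",
                     "s3://refine-bucket/customer/",
                     "arn:aws:iam::123456789012:role/redshift-load")
```

The statement would then be executed against Redshift through any SQL client or driver; the sketch only shows how the pieces of the load fit together.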

Environment: Informatica 9.6.1/10.1, UNIX, AWS (Redshift), Impala (Big Data), Hybris, SFDC, SQL Server, Robot, BO, and Cognos.

Confidential, Plano, TX

Sr. ETL Engineer

Responsibilities:

  • Work in a fast-paced agile development environment to quickly analyze, develop, and test potential use cases for the business.
  • Worked with both onsite and offshore teams of ETL developers and ETL analysts.
  • Troubleshot Informatica-related issues.
  • Automation and scheduling of Informatica Workflows using Autosys.
  • Created commonly used Mapplets in Informatica Power center.
  • Extracted data from heterogeneous sources such as flat files and RDBMS tables, then transformed and loaded it into an Exadata Oracle database.
  • Used Type 1 SCD, Type 2 SCD, and CDC mappings to update Slowly Changing Dimension tables.
  • Tuned Informatica mappings and sessions for optimum performance.
  • Created Big Data POCs for clients who needed help with migration.
  • Worked on a Big Data framework (EZ Flow) for sourcing data, and used Hive.
  • Good knowledge of AWS services such as S3 buckets and Lambda.
  • Loaded customer data into AWS using JSON files.
  • Coordinated with business and development teams for closure of UAT defects.
  • Identified bottlenecks in the source, target, mapping, and loading process, and successfully resolved performance issues across the project.
  • Tuned Informatica load performance by finding bottlenecks at the Informatica mapping and session levels.
  • As part of Data Quality analysis: Traced Data Lineage of data elements to be used in the CSDR.
  • Researched new and existing data sources in order to contribute to new development, improve data management processes, and make recommendations for data quality initiatives.
  • Determined root cause for data integrity gaps resulting from previously uncontrolled migrations in order to provide appropriate data resolution and remedy process.
  • Used Debugger to test the mappings and fixed the bugs.
  • Used existing ETL standards to develop these mappings.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Held requirements discussions with the offshore team.
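
The CDC mappings used in this role reduce to detecting which source rows changed since the prior load; a minimal hash-comparison sketch in Python (illustrative only; the column and key names are hypothetical):

```python
import hashlib

def row_hash(row):
    """Hash the non-key columns so a change in any of them is detected."""
    payload = "|".join(str(row[k]) for k in sorted(row) if k != "id")
    return hashlib.md5(payload.encode()).hexdigest()

def detect_changes(previous, current):
    """Split current rows into inserts and updates relative to the prior load.

    previous: dict mapping id -> row hash saved from the last run
    current:  list of row dicts, each with an 'id' key
    """
    inserts, updates = [], []
    for row in current:
        if row["id"] not in previous:
            inserts.append(row)            # never seen before: new row
        elif previous[row["id"]] != row_hash(row):
            updates.append(row)            # hash differs: row changed
    return inserts, updates
```

In PowerCenter the equivalent comparison is usually done with a Lookup against the target plus an Expression computing a checksum (e.g. MD5) over the tracked columns; the sketch just isolates that decision logic.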

Environment: Informatica 9.6.1/10.1, UNIX, AWS, TOAD, Oracle, SQL Server, Actimize, Cognos, Autosys, Hadoop (Cloudera 4.3), MapReduce, Hive, Pig, Sqoop, Oozie

Confidential, Plano, TX

Data Modeler/Sr. Developer

Responsibilities:

  • Performed extraction, transformation and loading of data using different types of stages and by performing derivations over the links connecting these stages.
  • Worked on data lineage using Informatica Metadata Manager to maintain Data Governance (DG).
  • Performed debugging on these jobs using Peek Stage by outputting the data to Job Log or a Stage.
  • Responsible for monitoring/ troubleshooting jobs during the production of data load processes.
  • Involved in documentation for Design Review, Code Review and Production Implementation.
  • Designed the conceptual data model based on requirements and interacted with non-technical end users to understand the business logic. Modeled the logical and physical diagrams of the future state; delivered the BRD and the low-level design document.
  • Discussed the Data model, data flow and data mapping with the application development team.
  • Worked with business users to identify, prioritize, and resolve numerous data issues; created ETL project plans and drove design and development.
  • Handled both development and support teams and automated processes to reduce manual effort.
  • Involved in data profiling and data modeling.
  • Interacted with Business Analysts to finalize the requirements and documented the technical design document for Coding.
  • Worked with SCD stage for implementing slowly changing dimensions.
  • Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.
  • Prepared technical specifications for the development of Extraction, Transformation, and Loading (ETL) jobs to load data into various tables in the data marts.
  • Imported the required Metadata from heterogeneous sources at the project level.

Environment: ER/Studio Data Architect, Informatica 9.5.1, UNIX, Oracle, DB2, and Control-M
