ETL Developer Resume Profile
SUMMARY:
- Overall 12 years of experience in software design/development and project management.
- Expert in using Actimize Visual Modeler 4.9, RCM Designer 5.1, Informatica PowerCenter 9.0.1/8.1.1, Informatica Data Quality, PL/SQL, UNIX, shell scripting, the Erwin data modeling tool, and Informatica Metadata Manager.
- Expertise in ETL Architecture, Error Handling Strategies, Unit Testing and Systems Integration Testing.
- Expertise in developing Actimize flows and execution plans and configuring AIS Instances and RCM repositories.
- Designing Alert Types and setting Thresholds and Inclusion/Exclusion lists.
- Good experience in managing a team of developers with extended support from offshore.
- Project planning, implementation and monitoring.
- Coordinating with senior management on project timelines and resource requirements.
- Leverage technical, business and financial acumen to communicate effectively with client executives and their respective teams.
- Expert in agile and waterfall project management methodologies. Able to manage large project teams and known for high-quality deliverables that meet or exceed timeline and budgetary targets.
- Excellent troubleshooting and analytical skills.
- Strong knowledge and hands-on experience in data analysis, data profiling and data modeling.
- Comfortable with DWH technologies such as metadata management, dimensional modeling, star schema modeling, snowflaking, late-arriving dimensions, rollup and aggregation strategies, and data mart creation.
- Extensively involved in end-to-end solution design, covering data modeling and the design of extract, transform and load processes that move data from heterogeneous source systems to target systems.
- Performance tuning of ETL mappings using best practices such as pushdown optimization and avoiding memory- and CPU-intensive transformations such as Aggregator and Rank.
- Performance tuning of SQL and PL/SQL code using techniques such as bulk collection, MERGE statements, inline views, parallelization and optimizer hints (see the PL/SQL sketch after this summary list).
- Good experience in implementing SQL analytic functions such as RANK and ROW_NUMBER.
- Good experience in implementing table partitioning and materialized views with the query rewrite facilities available in the Oracle RDBMS (see the SQL example after this summary list).
- Implemented version control using SVN and managed versioning for the project.
- Scheduling application-related UNIX, Informatica and SQL jobs through Autosys and Control-M.
- Ensuring Informatica best practices based on Velocity guidelines.
- Strong experience working with clients to gather and analyze business requirements.
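As an illustration of the bulk collection and MERGE-based tuning mentioned above, here is a minimal Oracle sketch; all object names (stg_trader, dim_trader, stg_trades, trades_hist) are hypothetical placeholders rather than actual project objects.

```sql
-- Hypothetical names for illustration only.
-- Set-based upsert via MERGE instead of row-by-row inserts/updates.
MERGE INTO dim_trader d
USING stg_trader s
   ON (d.trader_id = s.trader_id)
 WHEN MATCHED THEN
   UPDATE SET d.trader_name = s.trader_name,
              d.desk_code   = s.desk_code
 WHEN NOT MATCHED THEN
   INSERT (trader_id, trader_name, desk_code)
   VALUES (s.trader_id, s.trader_name, s.desk_code);

-- Bulk collection with a LIMIT to cap memory use, plus FORALL for array DML.
-- Assumes trades_hist has the same column structure as stg_trades.
DECLARE
  CURSOR c_src IS SELECT * FROM stg_trades;
  TYPE t_trades IS TABLE OF stg_trades%ROWTYPE;
  l_trades t_trades;
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_trades LIMIT 10000;
    EXIT WHEN l_trades.COUNT = 0;
    FORALL i IN 1 .. l_trades.COUNT
      INSERT INTO trades_hist VALUES l_trades(i);
    COMMIT;
  END LOOP;
  CLOSE c_src;
END;
/
```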
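Similarly, a minimal sketch of the analytic function, partitioning and query-rewrite features mentioned above, assuming a hypothetical fact_trades table; names and column lists are illustrative only.

```sql
-- Hypothetical names for illustration only.
-- Range-partitioned fact table so queries and purges touch only the relevant partitions.
CREATE TABLE fact_trades (
  trade_id   NUMBER,
  trade_date DATE,
  version_no NUMBER,
  notional   NUMBER
)
PARTITION BY RANGE (trade_date) (
  PARTITION p2012 VALUES LESS THAN (DATE '2013-01-01'),
  PARTITION pmax  VALUES LESS THAN (MAXVALUE)
);

-- Analytic function: keep only the latest version of each trade.
SELECT *
  FROM (SELECT t.*,
               ROW_NUMBER() OVER (PARTITION BY t.trade_id
                                  ORDER BY t.version_no DESC) AS rn
          FROM fact_trades t)
 WHERE rn = 1;

-- Materialized view with query rewrite: the optimizer can transparently
-- redirect matching summary queries to the pre-aggregated data.
CREATE MATERIALIZED VIEW mv_trades_daily
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
ENABLE QUERY REWRITE
AS
SELECT trade_date,
       COUNT(*)      AS trade_cnt,
       SUM(notional) AS total_notional
  FROM fact_trades
 GROUP BY trade_date;
```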
SKILLS SUMMARY:
- Technical - System Design and Development:
- Requirements Analysis, ROI Analysis, Project Scheduling, Gap Analysis, Design, Development and Implementation, Data Modeling, Systems Engineering, System Migrations/Integrations, Enterprise-wide Implementations
- ETL Design/Architecture, Development, Testing, QA, Rollout Support, Tuning and Scheduling, SQL Tuning, Exadata Migration
- Languages, Software and Tools:
- Informatica PowerCenter 9.x/8.x/7.x, Informatica Data Quality, Actimize Visual Modeler and RCM Designer, C, Erwin Data Modeler, UNIX, SQL*Loader, Toad, PL/SQL, Control-M, Autosys
- RDBMS: Oracle 10g/9i and Exadata, DB2, SQL Server
- Operating Systems: UNIX, Windows XP/2007
REPRESENTATIVE PROJECT EXPERIENCE:
Confidential
About the Project
- Compliance Surveillance Data Warehouse is the enterprise-wide compliance data warehouse of Confidential. We receive data from 100 upstream trading and books-and-records systems. These systems run on diverse platforms, so the interfaces to our system are also very heterogeneous: some systems send data through plain delimited flat files, some use message queues, some mainframe-based systems send highly denormalized COBOL files, and for some systems we connect directly to their databases, such as SQL Server, Sybase and DB2. Using Informatica we integrate the data into the data warehouse, on which we run 70 compliance and surveillance Actimize models. These surveillance models catch potentially fraudulent activities or regulatory violations such as Confidential. The alerts are accumulated in RCM, where users can track them. Users can also perform ad hoc analysis and research using the DART tool.
- We support many critical applications such as
- 150 Surveillance models, coded using Actimize
- Take Over Panel
- Equity Aggregation Data Mart
- These applications have very critical SLAs in terms of data availability, which requires our ETLs and Actimize models to be highly performance-tuned; this is very challenging considering the volume of data. We have employed various specialized features of Informatica such as persistent cache, multi-partitioned sessions, and multiple instances of the same workflow running in parallel. On the Actimize side we made balanced use of active tables and database stored procedure calls. We also employed many database-level tuning techniques, such as partition pruning, materialized views, and performance-enhancing SQL hints (a brief tuning sketch appears at the end of this project description). We are also in the process of migrating from Oracle 10g to Exadata 11g to increase performance and meet large data retention requirements.
- In the broader context presented above, my responsibilities can be summarized as below.
Role - Senior Data Architect ETL/Actimize
Responsibilities
- Responsible for designing and implementing complete Actimize workflows to cater to surveillance requirements, and coordinating with business users to get UAT sign-off on critical solutions.
- Managing and guiding a team of developers, with extended support from offshore, for the implementation of the designed solutions.
- Designing alert types using RCM Designer; setting up thresholds and inclusion/exclusion lists in the RCM repository for various surveillance models and making sure the link is properly established in the corresponding AIS code for each model.
- Setting up Dev/QA and Production AIS instances and assigning proper port numbers, number of connections per instance and other parameters in the AIS config.xml file; setting up the proper staging area, data and metadata folders and their locations on the AIS server.
- Translating business rules related to catching fraudulent transactions and compliance violations into Actimize detection flows.
- Project planning, implementation and monitoring; coordinating with senior management on project timelines and resource requirements.
- Designing the job schedule and getting it implemented using Autosys; scheduling the data load strategy.
- Guiding the release management team to ensure seamless code migration and version control of the developed code using SVN.
- ETL design and fine-tuning of the ETL code for performance gains; fine-tuning of the PL/SQL code for performance.
- Verifying the test plan and test results prepared by the testing team.
- Data modeling using Erwin.
- Analyzing defects raised by users and fixing them.
- Leverage technical, business and financial acumen to communicate effectively with client executives and their respective teams.
- Work with third-party vendors such as Informatica, Oracle and Actimize to sort out any tool-level issues.
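A minimal sketch of the partition pruning and hint-based tuning described above; the alert_fact table, its business_date partition key, the specific date and the degree of parallelism are all hypothetical assumptions for illustration.

```sql
-- A predicate on the partition key lets Oracle prune to a single partition
-- instead of scanning the whole table; the hints request a parallel scan of it.
SELECT /*+ FULL(a) PARALLEL(a, 8) */
       a.alert_type,
       COUNT(*) AS alert_cnt
  FROM alert_fact a
 WHERE a.business_date = DATE '2013-06-28'   -- partition key predicate => pruning
 GROUP BY a.alert_type;

-- Confirm pruning by checking the PSTART/PSTOP columns of the execution plan.
EXPLAIN PLAN FOR
SELECT a.alert_type, COUNT(*)
  FROM alert_fact a
 WHERE a.business_date = DATE '2013-06-28'
 GROUP BY a.alert_type;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```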
Confidential
Role - Sr. ETL/Informatica Lead
Responsibilities
- I worked on the Confidential central data warehouse containing research and drug development data, known as the Data Information Factory. My responsibilities were to create Informatica mappings based on the requirement specifications created by business analysts, and to diagnose and fix any production defects. Among other things, I used Informatica Metadata Manager to perform impact analysis of any changes we made upstream. We also used Trillium Discovery for source data profiling and business rule validation of the data.
- Technical Environment: Informatica PowerCenter 8.1.1, Informatica Metadata Manager, Sun Solaris 32-bit, Oracle 9i, Trillium Discovery
- Employer Tata Consultancy Services
Confidential
Role - Sr. ETL Developer
Responsibilities
- Work with BAs/users to finalize the requirements; create high-level and low-level designs for the ETL process.
- Perform data profiling and analysis on the source data; evaluate the quality of the data and work out strategies to handle late-arriving dimensions and conforming values (a sketch of one such strategy appears after this project's environment details).
- Chalk out an extensive error handling and reporting strategy and launch Informatica and database jobs.
- Design and implement complex ETL mappings to fulfill business requirements.
- Create shell scripts to preprocess the source-provided data files.
- Fine-tune the PL/SQL code and the ETL code for performance gains.
- Work with the release management team to ensure seamless code migration and version control of the developed code.
- Coordinate with the offshore team to get the code developed and meet project milestones.
- Work with third-party vendors such as Informatica and Oracle to meet tool-level challenges.
- Design and implement the Control-M schedule for the data load strategy.
- Verify the test plan and test results prepared by the testing team; coordinate with users to get UAT sign-off on deliverables.
- Analyze defects raised by users and fix them.
- Environment: Informatica PowerCenter 8.1.1, Informatica Data Quality, UNIX shell scripting, UNIX AIX 64-bit, Sun Solaris 32-bit, Oracle 10g, Control-M
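A minimal sketch of one late-arriving dimension strategy of the kind referenced above (an inferred-member pattern); dim_product, stg_sales, stg_product and dim_product_seq are hypothetical names, and the actual project may have handled this differently.

```sql
-- Create a placeholder ("inferred") dimension row for any product code seen in
-- the incoming facts but not yet present in the dimension, so the fact load
-- never drops rows or stalls on a missing lookup.
MERGE INTO dim_product d
USING (SELECT DISTINCT product_code FROM stg_sales) s
   ON (d.product_code = s.product_code)
 WHEN NOT MATCHED THEN
   INSERT (product_key, product_code, product_name, inferred_flag)
   VALUES (dim_product_seq.NEXTVAL, s.product_code, 'UNKNOWN', 'Y');

-- When the real dimension feed arrives later, overwrite the placeholder attributes.
UPDATE dim_product d
   SET (product_name, inferred_flag) =
       (SELECT p.product_name, 'N'
          FROM stg_product p
         WHERE p.product_code = d.product_code)
 WHERE d.inferred_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_product p
                WHERE p.product_code = d.product_code);
```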
Confidential
Role - ETL Developer
- Responsibilities: Design and develop ETL solutions using Informatica, Oracle and UNIX scripts. Create unit test plans and perform unit testing; support UAT and QA on the developed solution. Debug and fix any defects encountered in UAT or in production.
- Environment: Informatica PowerCenter 7.4, UNIX AIX 64-bit, Oracle 10g, UNIX shell scripting using AWK/SED, AppWorx