
Programmer Analyst Leader Resume

Foster City, CA

SUMMARY

  • Over 12 years of strong experience in Analysis, Design, and Data Modeling for Data Warehousing and Web-Based Application Development.
  • Oracle 9i DBA & PL/SQL Developer Certified Associate.
  • Strong experience in ETL development using Hadoop, Ab Initio (Co>Operating System 3.1.2, GDE 3.1.2), DataStage 8.1, ODI 10.1.3, Oracle 9i/10g/11g, and PL/SQL
  • Extensive experience with Data Migration, Data Transformation, Data Loading, Data Lineage, and Data Dependency analysis using Oracle, Teradata, and DB2 databases and flat files
  • Experience in design and modeling of dimensions, fact tables, and Star and Snowflake schemas, using Slowly Changing Dimension (SCD) techniques
  • Strong programming skills in UNIX shell scripting and Oracle PL/SQL, including tables, indexes, synonyms, constraints, materialized views, bulk features, collections, stored procedures, functions, packages, triggers, and autonomous transactions
  • Experience with Oracle development using PL/SQL, SQL Developer, TOAD, SQL*Loader, Data Pump, and Export/Import
  • Extensive experience with the Teradata data warehouse utilities FastLoad, MultiLoad, and BTEQ scripts
  • Proficiency in Performance Tuning of SQL Queries and Views using EXPLAIN PLAN, TKPROF utilities, Indexes and COLLECT STATISTICS
  • Extensive experience with scheduling tools: crontab, Autosys, and Control-M
  • Experience with working in an onshore/offshore model
  • Extensively developed and implemented reports using Crystal Reports
  • Additional knowledge in Informatica 8.x and Reporting Tools Business Objects and Oracle Discoverer
  • Results-oriented, committed, and hard-working, with excellent communication and interpersonal skills and a drive to learn new technologies and undertake challenging tasks.

TECHNICAL SKILLS

Operating Systems: SUN OS 5.6, Solaris 2.X, UNIX (Solaris, HP, and AIX), MS Windows NT/98/95/2000, and MS DOS

Databases: Oracle 9i/10g/11g, SQL Server 2000/2005, DB2/UDB, MS Access, Teradata

Data warehouse Tools: Hive, Pig, Ab Initio 3.1.2, DataStage 8.1, Informatica PowerCenter 7.x/8.x, ODI 10.1.3

BI & Reporting: SSRS, Crystal Reports 9, Business Objects XI, Oracle Discoverer

Data Modeling: Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, Facts, Dimensions), Entities, Attributes, Cardinality, ER Diagrams, ERwin 4.x, Oracle Designer 2000

Languages/Web Tech: C, C++, Java, XML, Visual Basic 5.0/6.0, PL/SQL, Shell Programming, Autosys, JSP, VB Script, Java Script

Development Tools: Oracle9i Forms/Reports, Visual InterDev, Visual SourceSafe 6.0, Caliber, Dimensions, Control-M, Mercury Quality Center, ClearCase, WebLogic, Trillium 7.6

PROFESSIONAL EXPERIENCE

Confidential, Foster City, CA

Programmer Analyst Leader

Responsibilities:

  • Working on the Confidential Enterprise Data Warehouse project; responsible for designing and developing BI reporting solutions using Ab Initio 3.2.5, Oracle PL/SQL, Hive, Pig, Hadoop Streaming, and the MicroStrategy suite.
  • Working with business users to collect business requirements and understand business processes; responsible for preparing ETL design and functional specifications.
  • Interact with senior management and business analysts to create new reporting data models and business intelligence reporting; involved in designing fact and dimension models. Responsible for identifying new data sources for analysis.
  • Leading a development team and responsible for on-time delivery of allocated DW project/project modules.
  • Developed high-volume data processing jobs using Ab Initio, Hive, and Pig, with performance-tuned components.
  • Developed several KornShell programs, functions, and packages to perform pre-inspection, post-inspection, and pre-extraction inspection of the data loaded each week and to capture errors, which are then used to generate error reports; automated Ab Initio graphs using KornShell scripts.
  • Extracted data from MVS sources and flat files, processed it, and loaded it into a DB2 UDB system.
  • Performed dependency analysis of graphs and created sandboxes and the EME project directory.
  • Coordinated with different testing groups to accommodate their test data requirements and translated them into data selection criteria.
  • Involved in setting up the environment in Control-M and installing builds.
  • Conducted reviews and code walk-throughs and tracked the progress of work.
  • Supported QA and UAT by preparing the environments, providing execution instructions, troubleshooting issues, and resolving defects
  • Involved in troubleshooting performance issues and failures in all environments.
  • Mentored team members on processes and troubleshooting.
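
A KornShell-driven load inspection of the kind described above can be sketched roughly as follows. This is a minimal, self-contained illustration with hypothetical file and field names (`weekly_feed.dat`, `EXPECTED_FIELDS`), not the actual production script:

```shell
#!/bin/ksh
# Sketch of a pre-load inspection: verify that a pipe-delimited feed file
# has the expected field count on every row, and report bad records.
# File name and expected layout are hypothetical.

FEED_FILE=${1:-weekly_feed.dat}
EXPECTED_FIELDS=5
ERROR_REPORT=inspection_errors.txt

# Create a small sample feed so the sketch is runnable on its own.
cat > "$FEED_FILE" <<'EOF'
1001|John|Doe|2023-01-01|100.00
1002|Jane|Smith|2023-01-02
1003|Bob|Jones|2023-01-03|250.50
EOF

: > "$ERROR_REPORT"
line_no=0
while IFS= read -r line; do
  line_no=$((line_no + 1))
  fields=$(printf '%s\n' "$line" | awk -F'|' '{print NF}')
  if [ "$fields" -ne "$EXPECTED_FIELDS" ]; then
    echo "line $line_no: expected $EXPECTED_FIELDS fields, got $fields" >> "$ERROR_REPORT"
  fi
done < "$FEED_FILE"

bad=$(grep -c '' "$ERROR_REPORT")
echo "inspected $line_no records, $bad rejected"
# prints: inspected 3 records, 1 rejected
```

In practice, a wrapper of this shape would reject the bad file (or quarantine the bad rows) before the Ab Initio graph is launched, and the error report would feed the weekly error reporting.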

Environment: Ab Initio GDE 3.1.2, Co>Operating System 3.2.5, MicroStrategy, Hive, Pig, Impala, Hadoop Streaming, Sqoop, UNIX, DB2, Oracle Exadata

Confidential, Florida

Lead ETL Developer

Responsibilities:

  • Worked on the Merrill Lynch Data Conversion and Data Management team on data extraction, fictionalization (data masking), subsetting, data cleansing, and data validation
  • Involved in all stages of the SDLC. Analyzed, designed, and tested the new system for performance, efficiency, and maintainability using the ETL tools Ab Initio and DataStage
  • Responsible for requirements gathering, analysis, and development for data synchronization with Brokerage, Annuity, and Sales NFS trades data.
  • Coordinated with different testing groups to accommodate their test data requirements and translated them into data selection criteria in Ab Initio and DataStage formats
  • Worked on legacy account conversion from Merrill Lynch accounts to Bank of America, from cleansing through implementation
  • Developed several KornShell programs, functions, and packages to perform pre-inspection, post-inspection, and pre-extraction inspection of the weekly data loads and to capture errors used to generate error reports; automated Ab Initio graphs with KornShell scripts.
  • Performed batch processing for data downsizing (subsetting)
  • Provided production support for daily/monthly cycle execution from a data perspective.
  • Involved in troubleshooting Ab Initio/DataStage performance issues and failures in all environments.
  • Documented ways to automate manual processes and created JIL scripts (Autosys jobs) for job scheduling
  • Maintained the sandbox, storing all work in sequential order.
  • Developed UNIX shell scripts for parsing and processing data files; maintained and troubleshot overnight batch processes.
  • Extensively used the Teradata utilities FastLoad, MultiLoad, and BTEQ scripts
  • Created various PL/SQL stored procedures, functions, and triggers for ad hoc reports and batch processing
  • Involved in documentation of the entire project. Mentored other team members.
  • Coordinated with development on future changes to file or table structures to accommodate future testing requirements
  • Applied a combination of project, technical, and managerial skills, with proven ability to solve complex problems while exceeding performance expectations; cooperated and communicated with architects and business groups to achieve common business goals
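
The Teradata batch loads mentioned above are typically driven by generated BTEQ control scripts. The shell wrapper below sketches that generation step only; the logon string, table, and file names are hypothetical, and actually running `bteq` requires a Teradata client, so it is not invoked here:

```shell
#!/bin/ksh
# Sketch: generate a BTEQ script that imports a pipe-delimited file into
# a staging table. All identifiers below are illustrative placeholders.

BTEQ_SCRIPT=load_accounts.bteq
DATA_FILE=accounts.dat
TARGET_TABLE=stage_db.accounts_stg

cat > "$BTEQ_SCRIPT" <<EOF
.LOGON tdprod/etl_user,****;
.IMPORT VARTEXT '|' FILE = $DATA_FILE;
.QUIET ON
.REPEAT *
USING (acct_no VARCHAR(10), acct_name VARCHAR(50))
INSERT INTO $TARGET_TABLE (acct_no, acct_name)
VALUES (:acct_no, :acct_name);
.QUIT
EOF

echo "generated $BTEQ_SCRIPT for $TARGET_TABLE"
```

A scheduler job (Autosys in this engagement) would then execute the generated script, e.g. `bteq < load_accounts.bteq`, and check its return code before the next cycle step.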

Environment: Ab Initio GDE 1.14.26, Co>Operating System 2.14.62, DataStage 8.1, UNIX, Oracle 11g, Teradata 6.0, Autosys, TOAD SQL, and Maximo

Confidential, Florida

Sr. Oracle/ETL Developer

Responsibilities:

  • Worked on the FDW (Financial Data Warehouse) enterprise data warehouse project; responsible for designing and developing BI reporting solutions using ODI, BAM, and Oracle SOA Suite. Created ETL processes in ODI to populate staging and data warehouse tables and analysis cubes.
  • Installed and configured ODI 10.1.3.5, Oracle 10g, and Oracle BIEE in the Windows environment
  • Worked on migration projects, including database migration from SQL Server 2000 to Oracle 11g and consolidation of databases. Migrated DTS packages from SQL Server 2000 to ODI interfaces in Oracle 11g.
  • Worked with business users to collect business requirements and understand business processes; responsible for preparing functional specifications and documentation. Interacted with senior management and business analysts to create new reporting data models and business intelligence reporting; involved in designing fact and dimension models. Responsible for identifying new data sources for analysis.
  • Created design documents covering the data preparation, data processing, and data completion stages.
  • Prepared canonical, staging, and aggregate tables for the SQL Server and mainframe source systems
  • Wrote a number of PL/SQL stored procedures, and developed custom PL/SQL objects such as functions, materialized views, and triggers to identify data issues while processing raw data and loading it into staging databases.
  • Worked with the Oracle Data Integrator tools: Security Manager, Topology Manager, Designer, and Operator.
  • Used different ODI Interfaces for Extraction/Loading/Transformation, data consolidation, data Integration and data cleansing
  • Created ODI Interfaces using ODI Knowledge modules for extracting the data from different sources
  • Extensively used ODI designer to create and manipulate source data stores, target data stores, packages, interfaces, procedures, variables
  • Utilized different knowledge modules as per mapping rules to load data from SQL Server and legacy systems
  • Implemented ODI packages and Scenarios for scheduling purposes and created several ODI solutions to migrate code from one environment to another. (Dev to Test and UAT to Prod environment)
  • Used ODQ (Oracle Data Quality) in the data quality process to standardize, cleanse, and de-duplicate business-related data and to correct erroneous data.
  • Configured Secure NDM and SFTP on Servers for transmission of files across multiple systems and NTFS Tape Backups/Restores using Veritas

Environment: ODI 10.1.3, Oracle 11g, SQL Server 2005, BAM, Oracle SOA Suite, PL/SQL, UNIX, Autosys

Confidential, Denver

Sr. ETL Developer

Responsibilities:

  • Involved in ETL design and coding using the Ab Initio ETL tool to meet requirements for extraction, transformation, cleansing, and loading of data from source to target data structures
  • Prepared the estimation, design effort and SDLC documents for the enhancements
  • Led a team and was responsible for on-time delivery of allocated DW project modules.
  • Assigned work to the offshore team, coordinated project status, and mentored offshore developers in best practices
  • Involved in Designing of Star-Schema and Dimensional Modeling with ER Diagram using Erwin.
  • Created complex transformations and multifile systems using Aggregate, Scan, Normalize, Rollup, Denormalize, conditional DML, sequence generator, Lookup, Join, and stored-procedure components, with parallelism
  • Extracted source data on customer accounts from DB2, SQL Server 2005, Oracle 10g, COBOL/JCL, XML, and flat files, and loaded it into a relational data warehouse and flat files. The data was extracted and written to Oracle through MPP.
  • Extensively used Slowly Changing Dimensions (SCDs) to handle incremental loading of dimension and fact tables.
  • Provide performance tuning of Ab Initio graphs by employing Ab Initio performance components like Lookups (instead of joins), In-Memory Joins and rollups to speed up execution.
  • Modified Ab Initio graphs to utilize data parallelism and thereby improve the overall performance to fine-tune the execution times by using multi file systems and lookup files whenever required.
  • Used Trillium (Converter, Parser, Geocoder, Matcher) to cleanse customer addresses for data matching, and used key generation and XML for data modeling and transformations
  • Used Oracle hints to tune embedded Oracle SQL queries, and also tuned using EXPLAIN PLAN, TKPROF, indexes, and COLLECT STATISTICS
  • Developed unit test cases and maintained them in Quality Center. Set up the environment for integration testing, including simulating complex production systems on the integration test servers. Used SFTP to transfer files from remote systems; this included setting up the environment in Control-M and installing builds
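
The EXPLAIN PLAN tuning workflow above is commonly scripted as a shell wrapper around SQL*Plus. The sketch below only generates the plan-review script (the query, index name, and connect string are hypothetical, and `sqlplus` itself needs an Oracle client, so it is not invoked):

```shell
#!/bin/sh
# Sketch: wrap a candidate query in EXPLAIN PLAN and a DBMS_XPLAN call
# so its execution plan can be reviewed. Identifiers are illustrative.

PLAN_SQL=explain_check.sql
QUERY="SELECT /*+ INDEX(t acct_idx) */ * FROM accounts t WHERE acct_no = :1"

cat > "$PLAN_SQL" <<EOF
EXPLAIN PLAN FOR
$QUERY;
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
EOF

echo "generated $PLAN_SQL"
# Typical invocation (not run here):
#   sqlplus -s user/pass@db @explain_check.sql
```

Comparing the plan output before and after adding a hint or index is the loop that the bullet points above describe; TKPROF serves the same purpose for traced executions.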

Environment: Oracle 10g/11g, SQL*Loader, PL/SQL, TOAD 8.6, SQL Server 2000/2005, Teradata, DB2, Ab Initio Co>Op 2.13/2.14, GDE 1.14, Plan>It, UNIX, Autosys, XML, Informatica 7.1, Erwin, Crystal Reports, Trillium 7.6 (Converter, Parser, Geocoder, Matcher), MS Visio 2003, Caliber, Dimensions, Control-M, Quality Center
