ETL Architect/Lead (Sr. Consultant) Resume

Atlanta, GA

SUMMARY

  • Worked on all phases of the data integration and data warehouse development lifecycle, from requirements gathering through development, testing, implementation and support
  • Exceptional background in analysis, design, data modeling, development, customization, implementation and testing of software applications and products
  • Possess a good understanding of the Ralph Kimball and Bill Inmon modeling methodologies.
  • Demonstrated expertise with ETL tools, including Pentaho Kettle, Oracle Data Integrator (formerly Sunopsis), Ab Initio and Informatica, and with RDBMS platforms such as Oracle, DB2, Teradata, Sybase and SQL Server.
  • Expertise with all ODI tools - Topology Manager, Security Manager, Designer and Operator.
  • Strong hands-on experience with the Pentaho ETL tools - Kettle, Spoon and Pan.
  • Well versed in the Ab Initio tools - GDE, Co>Operating System and EME.
  • Strong leader with experience training junior developers and advising technical groups on ETL best practices and standards.
  • Excellent technical and analytical skills with a clear understanding of the design goals of ER modeling and dimensional modeling.
  • Good knowledge of the OBIEE, Tableau and Sisense BI analytics tools.
  • Experience with both Waterfall and Agile SDLC methodologies.
  • Team player with excellent communication and problem-solving skills.
  • Experience with well-known clients across lines of business including insurance/reinsurance, retail, banking, shipping, logistics, cable, ISP and manufacturing.

TECHNICAL SKILLS

ETL/ELT: Oracle Data Integrator, Sunopsis, Pentaho Kettle, Ab Initio, Business Objects Data Integrator, Informatica, DataStage, AWS Glue and SQL*Loader.

RDBMS: Oracle, MS Access, SQL Server, Teradata V2R3/ V2R4/V2R5, DB2 8.1/UDB, Essbase, Sybase.

Cloud DW: Snowflake, Redshift

Data Replication: ODI with Oracle Streams, ODI with Oracle GoldenGate

Languages: SQL, PL/SQL, UNIX shell scripting, COBOL, Transact-SQL, Java, C, C++, Perl, Jython, Python

BI: MicroStrategy, Cognos, OBIEE, Tableau, Sisense, Power BI

Web: HTML, ASP and JSP.

OS: MS-DOS, Windows, UNIX (Sun Solaris), IBM AIX 5.x/4.x, HP-UX, Linux.

UNIX Editors: UltraEdit, vi.

Database Tools: Teradata SQL Assistant 6.1, TPump, FastLoad, MultiLoad, Developer 2000 Forms 4.5/5.0/6i, Reports 6i, SQL*Loader, TOAD.

Data Modeling Tools: Erwin, Visio.

Schedulers: Appworx, Control-M, Autosys, Cron, ODI scheduler

Cloud Technologies Knowledge: AWS, Oracle RDS, Oracle ADW, Snowflake, Redshift

PROFESSIONAL EXPERIENCE

Confidential, Atlanta, GA

ETL Architect/Lead (Sr. Consultant)

Responsibilities:

  • Leveraged the only existing Production environment to set up Development and QA environments for the ODI ELT system.
  • Initiated and implemented the migration and change management process.
  • Created templates and standardized the procedures and documentation for unit testing, integration testing and source-to-target (S2T) mapping documents.
  • Set up ODI naming standards and best practices for design, development and testing.
  • Led the IBM team in performance tuning and streamlining all the loads, which yielded roughly a 30-40% improvement in nightly batch processing time.
  • Led the Oracle OMCS consulting services team in migrating the ODI ETL application from the on-premises data center to Oracle Managed Cloud Services.
  • Involved in building the EDW as well as Data marts encompassing various subject areas including Production, Sales, Job studies, AR, Inventory, Finance, etc.
  • Designed and developed the Inventory snapshot and Accounts Receivable (AR) data marts, from requirements gathering through design, development, testing and implementation.
  • Independently learned the new Appworx scheduler and used it to build ETL job orchestration as well as job monitoring scripts.
  • Designed and developed a business-critical, near-real-time data replication application using the ODI change data capture (CDC) capability, running 24/7 and feeding production plants via Oracle Streams.
  • Migrated the data replication from Oracle Streams to Oracle GoldenGate in the AWS cloud. Customized the journalizing knowledge module (JKM) for this scenario and provided a solution for the PK update scenario in JD Edwards.
  • Customized the ODI knowledge modules, including the GoldenGate JKM.
  • Led the ECS/Syntax team in migrating the ETL application from the Oracle OMCS cloud to the AWS cloud, and in upgrading ODI 11g to ODI 12c.
  • Performance tuned ETL jobs, queries, resources, etc. so that the replication/ETL workflows met the required SLAs.
  • Performance tuned the ODI agents to use the AWS Windows servers optimally.
  • Used the ODI repository metadata tables to pull trends for job monitoring, to troubleshoot login timeouts and zombie processes, and to browse large logs that are difficult to view in the ODI 12c Operator (see the sample query after this list).
  • Supported the data delivery to BI applications in OBIEE and Sisense.
  • Involved in cloud and modern BI tool evaluations, including Snowflake, Qlik, ThoughtSpot, Tableau and Sisense.
  • Mentored junior developers in ETL, Datawarehousing, SQL and ODI.
  • Conducted design reviews and code reviews to enhance the quality of data integration jobs.
  • Served as the first point of contact for supporting all ETL applications: working with the Help Desk, creating and escalating tickets, and escalating to Oracle Support and implementation vendors to ensure the business was not impacted.
  • Managed the scheduled maintenance as well as unscheduled outages affecting critical Data Replication, Production databases and ETL servers
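
For illustration, a minimal sketch of the kind of work-repository query used for this trend monitoring. SNP_SESSION is the standard ODI session log table; the column names and the 'E'/'R' status codes follow the common 11g/12c repository layout and should be verified against the installed release.

    -- Hypothetical monitoring query against the ODI work repository.
    -- Verify SESS_NAME, SESS_STATUS, SESS_BEG and SESS_DUR (typically seconds) for your ODI version.
    SELECT   sess_name,
             COUNT(*)                                            AS runs_last_30_days,
             ROUND(AVG(sess_dur) / 60, 1)                        AS avg_minutes,
             ROUND(MAX(sess_dur) / 60, 1)                        AS worst_minutes,
             SUM(CASE WHEN sess_status = 'E' THEN 1 ELSE 0 END)  AS error_runs,
             SUM(CASE WHEN sess_status = 'R' THEN 1 ELSE 0 END)  AS still_running
    FROM     snp_session
    WHERE    sess_beg >= SYSDATE - 30
    GROUP BY sess_name
    ORDER BY avg_minutes DESC;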

Environment: ODI 10.x/11.x/12c, Oracle 10g/11g/12c, Oracle GoldenGate, SQL Server, Linux, Windows, AWS cloud, AWS Oracle RDS, Appworx 9.2.1, OBIEE 10g/11g/12c, Snowflake, Sisense

Confidential, Philadelphia, PA

ETL Lead/ Sr. ETL Consultant

Responsibilities:

  • Used Pentaho and ODI as ETL tools to build the data warehouse and data marts.
  • Worked in the Scrum methodology of an Agile environment. Participated in sprint planning meetings.
  • Created the ETL best practices and naming standard documents.
  • Performed extensive data profiling and data analysis. Avoided data latency issues through proper scheduling of the jobs.
  • Interacted with the business to gather functional requirements, converted them into technical requirements and implemented them.
  • Performed data modeling and dimensional modeling using Erwin. Created logical and physical data models. Worked on designing and implementing the facts, dimensions, ETL control tables, etc. Created star schema for these facts and dimensions.
  • Designed and developed the QA Framework based on the Kimball methodology to track and record error records using SQL screens. It consists of the Error Event Fact and a few dimensions. This model is very useful for tracking and improving data quality without manual intervention.
  • Installed and configured ODI in all the environments, including all the components - Topology Manager, Security Manager, Designer, Operator, agents, etc.
  • Set up the privileges and created users and roles through Security Manager. Also set up project- and object-level security.
  • Modeled the QA migration process using ODI and educated the team about it.
  • Worked with the WebLogic admin in setting up the Metadata Navigator for web viewing of ODI logs.
  • Manipulated large files through UNIX scripting and loaded them into the Oracle database through ODI.
  • Created various ODI interfaces, packages, scenarios ranging in complexity.
  • Judiciously selected the appropriate Knowledge modules (KM) and tuned them according to the specific needs of the team.
  • Modeled the ETL job orchestration for both ODI and Pentaho jobs. Educated the team on how to use it. Set up the CDC mechanism and incremental approach for STG and PRES layers. Used the batchnum concept for error handling and batch control.
  • Conducted code reviews.
  • Converted the ODI metadata into useful performance/error tracking information by building the performance data mart.
  • Created PL/SQL procedures to support the QA Framework and other ETL activities.
  • Installed and configured Pentaho Kettle on the desktop as well as the UNIX server.
  • Created various efficient Pentaho jobs and transformations.
  • Sourced data from XML files and integrated into the ETL flow.
  • Integrated web services with Pentaho.
  • Worked on most of the Pentaho components. Configured the Kettle properties. Modeled the Pentaho job to load multiple files.
  • Performance tuned the applications both at ETL layer as well as database. Achieved this through proper designing, ETL tool features, indexes, database partitioning concepts, etc.
  • Created several UNIX Korn shell scripts for Pentaho wrappers, transferring files, splitting files, etc.
  • Handled Personal Identifying Information (PII) data. Used hashing algorithms, including MD5, to translate the PII data into non-PII values (see the sketch after this list).
  • Used SVN as a version control tool.
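
As an illustration of the masking approach, a minimal sketch using Oracle's DBMS_CRYPTO package (which requires an explicit EXECUTE grant). The table and column names (stg_customer, ssn, pii_salt) are assumptions for the example, not the actual schema.

    -- Hash a PII column into a non-reversible hex string during staging.
    -- The pii_salt column is an assumption; salting makes the MD5 hash harder to reverse.
    SELECT customer_id,
           RAWTOHEX(
             DBMS_CRYPTO.HASH(
               UTL_RAW.CAST_TO_RAW(ssn || pii_salt),
               DBMS_CRYPTO.HASH_MD5)) AS ssn_hash
    FROM   stg_customer;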

Environment: ODI 10.1.3.5, Pentaho Kettle 4.1/4.2, Oracle 11g 11.2.0.1.0, Linux, Toad 10.6.1.3, TortoiseSVN 1.6.16, OBIEE 10g, Subversion 1.6.17, Erwin 4.0, Curl, Splunk 4.1.6, XML, SFTP, Teradata V2R5

Confidential, San Diego, CA

Sr. ETL Consultant/Team Lead

Responsibilities:

  • Installed the ODI client and configured the client connection to the master and work repositories.
  • Set up the physical architecture, logical architecture and contexts for the different Oracle schemas, SQL Server and flat files through Topology Manager.
  • Set up a single login for each data server in Topology. Set up the work schemas and assisted the DBAs in providing the necessary privileges during the topology setup phase.
  • Participated in the infrastructure setup meetings for performance enhancement.
  • Set up the ODI user ID access to the file server. Created users and profiles based on roles using Security Manager. Created and maintained agents.
  • Defined the logical and physical data stores and models by reverse engineering various source and target systems, including flat files.
  • Analyzed data, created technical design documents and mapping documents for various facts and dimensions with the required transformation logic and the source and target attributes.
  • Designed and developed efficient ODI scenarios and interfaces based on the existing BODI jobs and transformations.
  • Converted the BODI data flows to ODI interfaces, BODI workflows, BODI jobs to ODI packages and converted the BODI custom functions/transformations to ODI transformations.
  • Developed mappings and implemented several transformation rules for different interfaces using Designer.
  • Modified Knowledge Modules for better throughput of various interfaces and worked on performance tuning.
  • Worked on PL/SQL procedures, functions and control tables.
  • Worked with the offshore team in India.
  • Led a team of 4 developers and 2 business analysts.
  • Worked alongside an Oracle consultant in implementing best practices.
  • Participated in the project management approach meetings and provided input on an approach to meet the aggressive deadlines and on allocating work to onshore and offshore resources.

Environment: ODI 10.1.3.5, Business Objects Data Integrator 11.5.2.0, Topology Manager, Security Manager, Designer, Operator, Oracle 11g 11.1.0.7.0, SQL Server, UNIX AIX, Toad, Flat Files.

Confidential, NYC

Sr. ETL Consultant

Responsibilities:

  • Installed ODI. Set up the ODI connection with Oracle, MS SQL Server and flat files.
  • Set up the Topology including physical architecture, logical architecture and context.
  • Created new models for the data sources - flat files, MS SQL server, Oracle.
  • Worked closely with the Project Manager and Data Architect. Assisted the Data Architect in design by performing source data analysis, correcting the requirements documents and creating source-to-target mappings.
  • Designed the ETL flow for the SUN accounting system to reuse the existing logic of the TM1 accounting system and meet the tight deadlines.
  • Coordinated with the offshore UK team. Helped them define the source view for the SUN accounting system and resolve access issues.
  • Developed interfaces to load data from flat files and SQL Server to stage, and from stage to the Oracle hub.
  • Created ODI packages and scenarios using interfaces, variables and procedures.
  • Used ODI tools such as OdiFileMove, OdiFileAppend and OdiFileCopy. Implemented logic to archive the source files with a date stamp affixed after a successful load (see the sketch after this list).
  • Performance tuned the ODI interfaces. Optimized the knowledge modules to improve the functionality of the process.
  • Performed unit and integration testing. Created various test scenarios to test the application.
  • Delivered the assignments before the deadlines.
  • Conducted code reviews and planned the knowledge transfer to client.
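
The archiving step itself was done with the ODI file tools above; as a rough equivalent sketch in PL/SQL, the same date-stamp rename could look like the following. The directory objects (SRC_DIR, ARCH_DIR) and the file name are assumptions for illustration only.

    -- Rename a processed source file into an archive directory with a date stamp.
    -- SRC_DIR and ARCH_DIR must exist as Oracle directory objects with the needed privileges.
    BEGIN
      UTL_FILE.FRENAME(
        src_location  => 'SRC_DIR',
        src_filename  => 'sun_gl_extract.csv',
        dest_location => 'ARCH_DIR',
        dest_filename => 'sun_gl_extract_' || TO_CHAR(SYSDATE, 'YYYYMMDD') || '.csv',
        overwrite     => TRUE);
    END;
    /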

Environment: Oracle Data Integrator 10.1.3.5, Oracle Database 10g Enterprise Edition Release 10.2.0.4.0, OBIEE 10.1.3.x, TOAD 9.7.2.5, TOAD for SQL Server 4.5.0.854, Java, Jython, XML, MS Excel 2007.

Confidential, Secaucus, NJ

Sr. ETL Consultant

Responsibilities:

  • Performed extensive data analysis using the source data and UI to understand the data, assist the data modeler in designing the databases, identify bad data, and help the business improve and redefine the requirements.
  • Worked closely with the business (GBPM). Provided options and alternative solutions to the business when data anomalies arose.
  • Involved in CDW and Mart design meetings. Created Technical design documents.
  • Created ODI design documents from the existing Informatica mappings. Used these design documents in development of ODI interfaces/packages.
  • Performance tuned the interfaces and queries through the use of indexes, ranking functions, Oracle hints, exchange partitions, etc. (see the sketch after this list).
  • Created ODI (Sunopsis) interfaces for loading data from Sybase to the CDW and from the CDW to the different marts.
  • Participated in code reviews. Performed code promotions from dev to test, staging and production environments using ODI.
  • Wrote unit test cases, test scenarios and unit test documents, and verified the QA team's test scenarios. Unit tested all the possible test scenarios.
  • Helped QA team in writing the appropriate test scenarios.
  • Did error analysis for continued improvement of the system by catching data anomalies and code issues.
  • Planned and prioritized tasks to meet the deliverable deadlines.
  • Participated in all the phases of project - development, testing, deployment and support.
  • Used Oracle analytic functions, global temporary tables, etc.
  • Performed SQL and PL/SQL programming for ODI and Oracle.
  • Worked extensively with ODI Designer, Operator and Metadata Navigator.
  • Led the team for the AR and DnD marts.
  • Involved in release meetings and created release notes.
  • Involved in rotational production support.
  • Complied with the JSOX standards.
  • Interacted with BI team during design and development of facts and dimensions to accommodate all the reporting requirements.
  • Worked on designing and developing data marts, which were used to build Cognos cubes, dashboards and reports. Created AR, DND and Equipment marts.
  • Used Cognos cubes, standalone reports and drill-through reports.
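
As an illustration of the exchange-partition technique referenced above, a generic sketch: the day's rows are prepared in a plain staging table and then swapped into the target partition as a near-instant dictionary operation. The table, partition and staging names are assumptions, not the actual mart objects.

    -- Swap a pre-loaded staging table into the target fact partition.
    -- INCLUDING INDEXES carries over local index segments; WITHOUT VALIDATION skips row checks.
    ALTER TABLE fact_ar_daily
      EXCHANGE PARTITION p_20100131
      WITH TABLE stg_ar_daily
      INCLUDING INDEXES
      WITHOUT VALIDATION;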

Environment: Oracle 10g, Sybase 12.5, Toad, Oracle Data Integrator 10.1.3.4.7 (Sunopsis V3), DB2 9.1, Informatica 7.x, Cognos 8, OSCAR, UNIX, MS Excel 2003.
