
Sr. ETL Developer Resume


Minneapolis, MN

PROFESSIONAL SUMMARY:

  • IT professional with 8 years of experience in system analysis, design, and development, including around 8 years of experience in IBM InfoSphere; certified in IBM DataStage v8.5/8.1, using components such as Administrator, Manager, Designer, and Director.
  • 2 years of experience in SAS programming and the SAS Business Intelligence suite.
  • 2 years of experience leading technical teams.
  • Experience in gathering requirements, preparing functional/technical specifications documents and interacting with the users.
  • Expertise in dimensional data modeling, star schema modeling, snowflake modeling, identification of fact and dimension tables, normalization, and physical and logical data modeling using ERwin and Oracle Warehouse Builder to implement Business Intelligence systems.
  • Experience in Parallel extender jobs, Sequencer jobs and Server Jobs in DataStage.
  • Expert in designing Parallel jobs using various stages like Join, Merge, Lookup, Remove duplicates, Filter, Dataset, Lookup file set, Complex flat file, Modify, Aggregator, XML.
  • Proficiency in data warehousing techniques for Slowly Changing Dimensions, surrogate key assignment, and Change Data Capture.
  • Proficient in working with Teradata, Oracle, Access, SQL Server, and DB2 Databases.
  • Hands on Experience with SAS programming, SAS BI tools.
  • Extensive experience in development of UNIX Shell scripts for file management, Scheduling ETL Jobs, Reject Reprocessing and SFTP processes.
  • Knowledge and understanding of MSBI (SSIS, SSAS, and SSRS).
  • Extensively used SQL to override the generated SQL in DataStage and to validate the data loaded into the database.
  • Proven track record in troubleshooting of DataStage Jobs and addressing production issues such as performance tuning and enhancement.
  • Excellent in documentation, end-user knowledge transfer, and production support.

TECHNICAL SKILLS:

RDBMS: Oracle 11g/10g/9i, DB2 7.x/8.x/10.x, MS Access 2010, SQL server, Teradata 12/14

ETL Tools: Datastage 8.x, QualityStage 8.x, SAS Enterprise studio 5.1, SAS Data Integration Studio 4.6

BI Tools: Cognos 6.x, Business Objects 4.x/5.1, SAS Web Report Studio, SAS Information Map 4.3, SAS OLAP Cube studio 4.3, SAS Dashboards, SAS IDP

Programming: SQL, PL/SQL, Base SAS 9.2, Python

Environment: Sun Solaris 5.x, 2.x, AIX 5.x, HP-UX B.11.31, UNIX, Korn Shell, Bourne Shell, Novell NetWare, Win 95/98, XP, Win NT 4.0/2000, Linux 2.6.9-55, Fedora Linux 10, Red hat Linux

PROFESSIONAL EXPERIENCE:

Confidential, Minneapolis, MN

Sr. ETL Developer

Responsibilities:

  • Designed and developed the ETL flow for Prior Auth, MDA and Specialty Reporting.
  • Handled many other fast track projects in parallel to the above projects.
  • Reviewed other developers' code for design issues and naming standards.
  • Led junior developers by providing detailed ETL flows.
  • Involved in data modeling sessions; reviewed data models; analyzed data using SQL queries and identified issues that led to changes in the Teradata data models.
  • Designed and implemented data quality rules dynamically through a table-driven approach; wrote stored procedures in Teradata.
  • Extensively leveraged BTEQ scripts in Teradata.
  • Loaded transaction-time and valid-time temporal tables and views in Teradata.
  • Wrote a UNIX script to create views in Teradata instead of creating them manually (see the sketch following this section).
  • Passed SQL to DataStage as a parameter, with the SQL maintained in a file.
  • Modified existing job designs for better performance, reducing runtimes.
  • Created jobs and sequencers with zero warnings and cleaned up warnings in existing jobs.
  • Created a UNIX script to register received input files and generate a process registration ID.
  • Created reusable jobs and sequencers which are driven using parameters, parameter sets and schema files.
  • Scheduled DataStage jobs using the ROBOT scheduling tool; created job events and monitors and implemented looping of ROBOT jobs based on reactivity.
  • Ran DataStage jobs through the scheduler in all environments and monitored them using DataStage Director.
  • Followed ETL standards in naming, design and development of Datastage jobs.
  • Created Unit Test documents and signoff documents.
  • Kept code and parameter files in sync across all environments: sandbox, dev, test, QA, and prod.
  • Unit tested the code and supported the QA team.
  • Participated in UAT with the client, maintaining clear communication throughout.

Environment: Datastage 8.7, DB2, Teradata 14.10, UNIX Shell Scripting, Putty, WinSCP, ROBOT scheduler, SVN subversion, SQL Server 2012.
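The Teradata view-creation script mentioned above followed the general pattern sketched below. This is a minimal illustration only: the logon file, database names, and table list are hypothetical placeholders, and a BTEQ client is assumed on the job server.

```sh
#!/bin/ksh
# create_views.ksh - generate and run REPLACE VIEW DDL in Teradata via BTEQ
# instead of creating each view by hand. All names below are placeholders.

LOGON_FILE=/home/etl/.tdlogon                # contains a .LOGON tdpid/user,password line
TABLE_LIST=/home/etl/config/view_tables.txt  # one table name per line
SRC_DB=EDW_TABLES
VIEW_DB=EDW_VIEWS

# Build one REPLACE VIEW statement per table and feed the whole batch to BTEQ.
bteq <<EOF
.RUN FILE=${LOGON_FILE}
$(while read -r tbl; do
    echo "REPLACE VIEW ${VIEW_DB}.${tbl} AS LOCKING ROW FOR ACCESS SELECT * FROM ${SRC_DB}.${tbl};"
  done < "${TABLE_LIST}")
.LOGOFF
.QUIT
EOF
```

Driving the view list from a plain file keeps the script reusable across environments without code changes.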

Confidential, Plainsboro, NJ

Sr. ETL Developer

Responsibilities:

  • Responsible for gathering and assessing business information needs and preparing system requirements.
  • Designed the process for reconciliation of the AGS project.
  • Built queries and DataStage jobs to reconcile spend data across the landing, staging, transaction, and Cegedim databases for reporting to state and federal agencies.
  • Created ETL technical design documents and mapping documents.
  • Analyzed the existing process in order to create new processes or modify existing ones per the requirements.
  • Created jobs with zero warnings and cleaned up warnings in existing jobs.
  • Created a new UNIX script to read multiple file names from 18 sources on the file server and load them into tables (see the sketch following this section).
  • Modified existing UNIX scripts per changed requirements and removed scripts no longer needed by the process.
  • Created individual sequencers for the 18 source systems and a master sequencer covering all of them.
  • Used parameter sets and eliminated unwanted parameters from the jobs.
  • Created and used a stored procedure in DataStage to update and resolve errors across multiple tables.
  • Sent email notifications on success or failure of the reconciliation process based on a flag value in a table.
  • Dynamically built the email subject and body with complete details of the spend data.
  • Created reports and attached them to the emails, providing the exact reason for process failure.
  • Unit tested the process and supported the QA team.
  • Sole team member, managing own tasks and deadlines.
  • Participated in UAT with the client, maintaining clear communication throughout.

Environment: Datastage 8.7, Oracle, UNIX Shell Scripting, Putty, WinSCP.
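The file-registration and notification scripting described above followed the general pattern below. This is a minimal sketch under stated assumptions: an Oracle sqlplus client and mailx are available on the job server, and the directory layout, control table, connection string, and addresses are hypothetical placeholders.

```sh
#!/bin/ksh
# register_files.ksh - record each file landed by the source systems in a
# control table and notify on completion. All names below are placeholders.

LANDING_ROOT=/data/landing
SOURCES="SRC01 SRC02 SRC03"              # the real process covered 18 source systems
NOTIFY=etl-support@example.com
CONN="etl_user/password@ORCL"            # assumes an Oracle sqlplus client is installed

status=0
for src in ${SOURCES}; do
  for f in ${LANDING_ROOT}/${src}/*.dat; do
    [ -f "$f" ] || continue              # skip when the glob matches nothing
    sqlplus -s "${CONN}" <<EOF || status=1
WHENEVER SQLERROR EXIT 1
INSERT INTO etl_file_registry (source_system, file_name, received_ts)
VALUES ('${src}', '$(basename "$f")', SYSDATE);
COMMIT;
EXIT;
EOF
  done
done

# Flag-driven notification: one mail on overall success, another on failure.
if [ ${status} -eq 0 ]; then
  echo "File registration completed for all sources." | mailx -s "Reconciliation feed: registration OK" "${NOTIFY}"
else
  echo "One or more files failed to register." | mailx -s "Reconciliation feed: registration FAILED" "${NOTIFY}"
fi
```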

Confidential, Tallahassee, FL

ETL Lead Programmer (Datastage/SAS)

Responsibilities:

  • Responsible for gathering and assessing business information needs and preparing system requirements.
  • Created ETL mapping documents to build the data warehouse and data marts.
  • Migrated code from DataStage 7.5.1 to DataStage 8.5.
  • Worked with various transformation stages in DataStage 8.5 and created new parallel jobs that improved performance compared to the jobs running in DataStage 7.5.
  • Used DataStage 8.5 and SAS DI Studio to create ETL jobs according to business requirements and mapping documents to move the data to the target.
  • Registered external CSV, flat, and XML files.
  • Registered different database servers and their databases using SAS Management Console and SAS DI Studio.
  • Used the User Manager in SAS Management Console to create new users and accounts and to assign users to accounts.
  • Used built-in SAS DI stages such as Change Data Capture, data quality, and control stages to create job flows from source to target.
  • Used SAS OLAP Cube studio to create and register cubes.
  • Used SAS Information Map Studio to create information maps based on reporting requirements.
  • Used SAS Web Report Studio to create reports from information maps and stored processes.
  • Used SAS Dashboard and SAS IDP to integrate all the Web Report Studio reports and provide drill-down actions.
  • Modified existing databases and database management systems, or directed programmers and analysts to make changes.
  • Worked as part of a project team to coordinate database development and determine project scope and limitations.
  • Wrote and coded logical and physical database descriptions, and specified database identifiers to the management system or directed others in coding descriptions.
  • Interacted with the Quality Assurance team for prototyping solutions, preparing test scripts, and conducting tests, as well as for data replication, extraction, loading, cleansing, and data modeling for data warehouses.
  • Maintained knowledge of software tools, languages, scripts, and shells that effectively support the data warehouse environment across different operating systems.
  • Ensured compliance with standards through code walkthroughs and metadata reports.

Environment: Datastage 8.5, SAS 9.2, SAS Enterprise studio 5.1, SAS Web Report Studio, SAS Information Map, SAS Dashboards, SAS IDP, Visio, Microsoft SQL Server, UNIX Shell Scripting, Microsoft Team Foundation Server, Microsoft Test Manager, WinSCP.

Confidential, Jersey City, NJ

Sr. Datastage Developer

Responsibilities:

  • Used the DataStage Designer to develop processes for extracting, cleansing, transforming, and loading data into SAP using IDocs.
  • Used different stages of Datastage Designer like Lookup, Join, Merge, Funnel, Filter, Copy, Aggregator, Sequential files, Datasets, Sort, Remove duplicate and Transformer with different functions.
  • Developed DataStage jobs and sequencers with no warnings, and redesigned already existing code for zero warnings.
  • Created project-level parameter sets and their value files for different environments.
  • Code reviewed Datastage Jobs and UNIX scripts.
  • Used Hash and Entire partitioning methods among other partitioning techniques.
  • Used DataStage Director for validating, executing, and monitoring jobs and processes and for checking the log files for errors.
  • Wrote a UNIX script to pick the most recent file from a remote server based on the timestamp in the filename (see the sketch following this section).
  • Responsible for running of DI processes using Autosys in PROD and Pre-Prod.
  • Responsible for fixing issues raised due to failures of DI processes.
  • Responsible for fixing defects that are part of Hyper-care defect resolution.
  • Modified the Autosys JILs and added processes to them for scheduling, covering processes that were not scheduled before or did not yet exist.
  • Identified the cause of Autosys/DataStage process failures and notified the concerned teams of the root cause.
  • Used DataStage Designer for importing metadata into the repository and for importing and exporting jobs between projects.
  • Performed unit testing of all jobs manually and monitored the data to verify that it matched.
  • Used Datastage Director and Peek stage for debugging.
  • Wrote Autosys JILs to schedule DataStage jobs and DI processes.
  • Used several stages while developing parent and child sequencers, such as Start Loop, End Loop, User Variables Activity, Nested Condition, Routine Activity, Exception Handler, abort notification, Wait For File, and Mail Notification, to build an overall master sequencer and to accomplish restartability.

Environment: IBM Information Server Datastage 8.1, Windows Vista Enterprise, Visio, AIX, SAP, Putty, WinSCP.
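The "most recent file" script mentioned above followed the pattern sketched below. This is an illustrative sketch only: the host, user, directories, and FEED_YYYYMMDDHHMMSS.csv naming convention are hypothetical, and key-based SSH authentication is assumed so the script can run unattended.

```sh
#!/bin/ksh
# latest_remote_file.ksh - fetch the newest remote file, where "newest" is
# determined by the timestamp embedded in the filename (FEED_YYYYMMDDHHMMSS.csv).
# Host, directories, and the naming pattern are placeholders.

REMOTE_USER=etluser
REMOTE_HOST=sftp.example.com
REMOTE_DIR=/outbound/feeds
LOCAL_DIR=/data/incoming

# List remote files; because the timestamp is zero-padded, a plain sort puts
# the most recent name last.
latest=$(ssh "${REMOTE_USER}@${REMOTE_HOST}" "ls ${REMOTE_DIR}/FEED_*.csv" \
         | awk -F/ '{print $NF}' | sort | tail -1)

if [ -z "${latest}" ]; then
  echo "No matching file found on ${REMOTE_HOST}" >&2
  exit 1
fi

# Pull only the newest file.
sftp "${REMOTE_USER}@${REMOTE_HOST}" <<EOF
cd ${REMOTE_DIR}
lcd ${LOCAL_DIR}
get ${latest}
bye
EOF
```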

Confidential, Chicago, IL

Sr. Datastage Developer

Responsibilities:

  • Migrated Datastage Jobs from Datastage 7.5 to Datastage 8.0.
  • Using the Functional Document (FD), prepared the Detail Design Document (DDD) for the development of DataStage jobs.
  • Used the DataStage Designer with complex Oracle queries to develop processes for extracting large amounts of data from the Oracle EDW.
  • Used different stages of Datastage Designer like Lookup, Join, Merge, Funnel, Filter, Copy, Aggregator, Sequential files, Datasets, Sort, Remove duplicate and Transformer with different functions.
  • Developed new DataStage jobs and sequencers with no warnings, and redesigned already existing code for zero warnings.
  • Designed and coded the automation of the SFTP process, which completes in less than 10 minutes and previously consumed 4 hours of IT time every day.
  • Created project-level parameter sets and their value files for different environments.
  • Wrote UNIX scripts to SFTP files, update the SFTP status in the database, and notify the business on success or failure (see the sketch following this section).
  • Used Hash and Entire partitioning methods among other partitioning techniques.
  • Used DataStage Director for validating, executing, and monitoring jobs and processes and for checking the log files for errors.
  • Integrated all the subject areas of the application into a master sequencer that is scheduled monthly.
  • Wrote highly complex Oracle queries that set up the plan configuration for the PBM business.
  • Used PL/SQL to write stored procedures with functional blocks and cursors to extract members based on business requirements.
  • Updated tables using PL/SQL scripts through DataStage jobs.
  • Used orchadmin to manage datasets in the file system when space filled up or datasets were corrupted.
  • Used Datastage Designer for importing metadata into repository, for importing and exporting jobs into different projects.
  • Performed unit testing of all jobs manually and monitored the data to verify that it matched.
  • Used Datastage Director and Peek stage for debugging.
  • Worked with the QA team, supporting them in running the jobs and analyzing data with complex logic.
  • Used several stages while developing parent and child sequencers, such as Start Loop, End Loop, User Variables Activity, Nested Condition, Routine Activity, Exception Handler, abort notification, Wait For File, and Mail Notification, to build an overall master sequencer and to accomplish restartability.

Environment: IBM Information Server Datastage 8.1, Windows Vista Enterprise, Visio, AIX, Oracle 10g/11g, TOAD, SQL developer, Tidal 3.2.1.4466, Atlassian tools, Putty, WinSCP.
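The SFTP scripting described above followed the general pattern sketched below. This is a minimal, illustrative sketch: the remote host, status table, connection string, and mail address are hypothetical placeholders, key-based authentication is assumed, and an Oracle sqlplus client and mailx are assumed on the job server.

```sh
#!/bin/ksh
# sftp_with_status.ksh - push a file over SFTP, record the outcome in a status
# table, and notify the business. All names below are placeholders.

FILE=$1
REMOTE=etluser@sftp.partner.example.com    # key-based authentication assumed
REMOTE_DIR=/inbound
NOTIFY=business-team@example.com
CONN="etl_user/password@EDWPRD"            # assumes an Oracle sqlplus client

# -b - makes sftp read the batch from stdin and return non-zero on any failure.
sftp -b - "${REMOTE}" <<EOF
cd ${REMOTE_DIR}
put ${FILE}
bye
EOF
rc=$?

if [ ${rc} -eq 0 ]; then status=SUCCESS; else status=FAILURE; fi

# Record the SFTP status so downstream jobs and reports can pick it up.
sqlplus -s "${CONN}" <<EOF
WHENEVER SQLERROR EXIT 1
UPDATE sftp_file_status
   SET status = '${status}', completed_ts = SYSDATE
 WHERE file_name = '$(basename "${FILE}")';
COMMIT;
EXIT;
EOF

echo "SFTP of $(basename "${FILE}") finished with status ${status}." \
  | mailx -s "SFTP ${status}: $(basename "${FILE}")" "${NOTIFY}"
```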

Confidential, Peoria, IL

Sr. Datastage Developer

Responsibilities:

  • Used Information Analyzer to gather and analyze the Technical metadata characteristics of the data.
  • Using the Interface Design Document (IDD), prepared the Technical Design Document (TDD) for the development of DataStage jobs.
  • Used the Datastage Designer to develop processes for Extracting, Cleansing, Transforming and Loading data into Teradata EDW.
  • Used different stages of Datastage Designer like Lookup, Join, Merge, Funnel, Filter, Copy, Aggregator, Sequential files, Datasets, Modify, and Sort.
  • Used the CDC (Change Data Capture) stage to capture new and updated records and implemented SCD Type 2.
  • Created project-level parameter sets and their value files for different environments.
  • Used the Teradata Enterprise stage for Teradata MultiLoad, and used the Teradata Connector and ODBC Connector stages for lookups against DB2 and Teradata views.
  • Used Hash and Entire partitioning methods among other partitioning techniques.
  • Used DataStage Director for validating, executing, and monitoring jobs and processes and for checking the log files for errors.
  • Scheduled Jobs using TIDAL scheduling Tool.
  • Used PL/SQL scripts to extract the required data for the source as well as for lookups.
  • Wrote stored procedures and triggers to update the Oracle tables.
  • Used Datastage Designer for importing metadata into repository, for importing and exporting jobs into different projects.
  • Performed unit testing of all jobs manually and monitored the data to verify that it matched.
  • Used the log in DataStage Director and the Peek stage for debugging.
  • Supported Testing Team in SIT by assisting them with Functionality as well as Datastage tool.
  • Worked in UAT, supporting the client in preparing the grief analysis report and validating data using Business Objects reports.
  • Developed UNIX scripts to automate the data load processes, such as moving files to the execution folder after identifying the correct files to read, and adding a header and footer to the file after successful completion of the job (see the sketch following this section).
  • Used several stages while developing parent and child sequencers, such as Start Loop, End Loop, User Variables Activity, Nested Condition, Routine Activity, and Mail Notification, to build an overall master sequencer and to accomplish restartability.
  • Assisted both the DEV and Testing OFFSHORE teams.

Environment: IBM Information Server Datastage 8.1, Windows Vista Enterprise, Visio, Linux 2.6.9-55, Oracle 11g, Teradata 12.0.0.09, DB2, Teradata SQL Assistant 12.0, Tidal 3.2.1.4466, HP Quality Centre 10.0,SAP Business Objects, SAP.
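The data-load automation scripting described above followed the general pattern sketched below. This is an illustrative sketch only: the directories, file-naming convention, and header/trailer layout are hypothetical placeholders.

```sh
#!/bin/ksh
# wrap_and_stage.ksh - move the correct input files to the execution folder and
# add a header/trailer to the output file after a successful job run.
# Directories and the naming convention below are placeholders.

INBOUND=/data/inbound
EXEC_DIR=/data/execution
OUTFILE=$1                               # file produced by the DataStage job

# Stage only the files for this run, identified by today's date in the name.
today=$(date +%Y%m%d)
for f in ${INBOUND}/*_${today}.txt; do
  [ -f "$f" ] && mv "$f" "${EXEC_DIR}/"
done

# Prepend a header and append a trailer carrying the detail record count.
recs=$(wc -l < "${OUTFILE}" | tr -d ' ')
{
  echo "HDR|$(basename "${OUTFILE}")|${today}"
  cat "${OUTFILE}"
  echo "TRL|${recs}"
} > "${OUTFILE}.tmp" && mv "${OUTFILE}.tmp" "${OUTFILE}"
```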

Confidential

ETL Analyst Programmer

Responsibilities:

  • Designed various procedures to get the data from all operational systems into the data warehouse.
  • Extensively used ETL to load Access data into the Oracle database.
  • Designed and developed various PL/SQL scripts to meet the business requirements.
  • Developed stored procedures using Oracle to automate the data loading process.
  • Involved in designing source-to-target mappings from sources to operational staging targets using DataStage Designer.
  • Used DataStage Director to run and monitor the jobs for performance statistics.
  • Used ERwin for data modeling (i.e., modifying the staging and SQL scripts in the Oracle and MS Access environments).

Environment: Ascential DataStage 7.5 (Designer, Manager, Director), Parallel Extender (PX), Oracle 9i, MS Access, SQL, PL/SQL, Toad, UNIX
