
Senior Data Warehouse / ETL Developer Resume

NJ

SUMMARY:

  • Twelve-plus years of IT experience in the design, development, and implementation of data warehousing and reporting projects.
  • Experience in all phases of the development life cycle, including requirements gathering/analysis, design, development, validation, fine-tuning, and testing of data warehouse projects.
  • Used the ETL tools Informatica and SSIS, Erwin for physical and logical data modeling, and Cognos and Business Objects for reporting.
  • Proficient in data analysis and in creating enterprise data models targeting both OLTP and OLAP decision-support systems.
  • Expertise in dimensional modeling for decision-support systems (DSS), creating star/snowflake schemas and identifying fact and dimension tables.
  • Strong experience with multiple databases, including Oracle, SQL Server, DB2, Teradata, and Sybase.
  • Experience working in different industries, including financial, pharmaceutical, and healthcare.
  • Experience in marketing campaigns, data quality, and data analysis.

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter/PowerMart 9.6/9.5/9.1/9.0/8.6/8.1/7.1/6.2/5.1, SSIS (SQL Server Integration Services) 2010

Data Modeling Tools: Erwin 7.2/4.0/3.5, Oracle Designer 10g R2

Data Quality Tools: Oracle (Silver Creek) Enterprise Data Quality for Product Data (EDQP)

Reporting Tools: Cognos 8.x/7.x, Business Objects XI R2

Databases: Oracle 11g/10g/9i/8i/7.x, DB2 on OS/390 mainframe, PostgreSQL, Sybase, and SQL Server 2000/2005/2010

Languages: SQL, C, PL/SQL

Version Control: CVS, GitHub, SVN

Operating Systems: MS Windows 98/NT/2000/XP, UNIX, IBM AIX, Sun Solaris, and Linux

Tracking Tools: ClearCase, ClearQuest, Quality Center

Miscellaneous: JIRA, TOAD, SQL Navigator, PL/SQL Developer, SQL Developer, PostgreSQL, MS Office suite, SQL*Loader, MS Project 2005

PROFESSIONAL EXPERIENCE:

Confidential, NJ

Senior Data Warehouse / ETL Developer

Responsibilities:

  • Collaborate with business analysts on requirements gathering and analysis.
  • Prepare data design and data mapping documents based on the specifications.
  • Handle data procurement from internal and external clients.
  • Lead a team of developers across the project deliverables.
  • Design and develop the ETL data flows for the Designated Broker Dealer (DBD) feeds and the Actimize models.
  • Work with disparate source systems, both internal and external, such as the DBDs (Designated Broker Dealers), DMS (Deal Management System), GTR, and the Research Supervision group.
  • Interact with source systems of external vendors such as Goldman Sachs, Ameriprise Financial, and Barclays, and with systems internal to the bank such as the Research Supervision group (RDR) and GTR.
  • Involved in the design and development of the CSDR (Compliance and Surveillance Repository) tables, the main platform for AIM.
  • Procure new feeds from external vendors supplying data for the Actimize models.
  • Involved in deployments and code migration, taking an active part in releases.
  • Work with multiple feed types, including COBOL and .dat files, connections to other databases such as SQL Server and DB2 on AS/400, and real-time transactions via an MQ setup.
  • Fix defects raised by business users and perform impact analysis of the fixes.
  • Involved in setting up an external source feed via the ML Clear setup.
  • Design and develop the SCD Type 2 ETLs that support Actimize model generation (a sketch follows this list).
  • Make extensive use of UNIX shell scripting and automate the daily feeds with the Autosys job scheduler.
  • Perform unit testing at various levels of development.
  • Perform data analysis in the initial stages of data procurement to identify bottlenecks.
  • Design and develop the ETL mappings end to end: data from various file and relational feeds populates the staging environment, followed by the CDM data mart and CSDR workflows.
  • Use Informatica PowerCenter (9.x versions) to build the CSDR warehouse and CDM data mart data flows.
  • Create complex mappings in Informatica PowerCenter, using various transformations to meet the business requirements.
  • Develop the ETLs on CDM (AS/400).
  • Work extensively with heterogeneous databases, including Oracle Exadata and DB2 on AS/400.
  • Modify existing mappings per enhancement requests from business users.
  • Promote code according to the ETL standards; create mapping design documents describing the data flow from sources to staging and from staging to target.
  • Involved in post-production support and validation.
  • Coordinate with offshore teams.
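
A minimal sketch of the SCD Type 2 pattern behind these ETLs, written as a shell script driving SQL*Plus. The connection details and all table, column, and sequence names (stg_account, dim_account, and so on) are illustrative placeholders, not the actual CSDR objects:

    #!/bin/ksh
    # SCD Type 2 load sketch: expire changed dimension rows, then insert new versions.
    # Connection details and all object names are placeholders.

    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    WHENEVER SQLERROR EXIT FAILURE

    -- Step 1: close out current rows whose source attributes have changed.
    UPDATE dim_account d
       SET d.eff_end_dt  = TRUNC(SYSDATE) - 1,
           d.current_flg = 'N'
     WHERE d.current_flg = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_account s
                    WHERE s.account_id = d.account_id
                      AND (s.broker_dealer <> d.broker_dealer
                           OR s.status <> d.status));

    -- Step 2: insert a fresh current row for every new or changed account
    -- (changed accounts have no current row left after step 1).
    INSERT INTO dim_account
           (account_key, account_id, broker_dealer, status,
            eff_start_dt, eff_end_dt, current_flg)
    SELECT dim_account_seq.NEXTVAL, s.account_id, s.broker_dealer, s.status,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_account s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_account d
                        WHERE d.account_id  = s.account_id
                          AND d.current_flg = 'Y');

    COMMIT;
    EOF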

Tools: Informatica 9.0/9.1/9.5/9.6, Oracle 11g Exadata, Toad, DB2, AS/400, Red Hat Linux, MS Excel, SQuirreL SQL

Confidential, PA

Senior Data Warehouse / ETL Developer

Responsibilities:

  • Gathered requirements and translated them into conceptual and logical process and data models.
  • Performed functional analysis with the business group and prepared technical design specifications for the ETL components.
  • Worked with statisticians, providing market trends and projections to gauge the market against competing manufacturers.
  • Designed and created the data marts for different business product categories.
  • Defined the ETL code and mappings based on the high-level design documents.
  • Made extensive use of Informatica transformations to apply the business rules to data coming from flat files as well as different relational databases.
  • Created sessions and workflows, automated the jobs to run on a schedule, and set up email job notifications.
  • Used Oracle procedures and packages to support legacy reports supplied to clients such as Pfizer, Novartis, and Merial.
  • Performance-tuned long-running SQL queries.
  • Performance-tuned mappings during the UAT and maintenance phases.
  • Tracked production failures and debugged sessions using the session logs.
  • Performed code reviews and unit testing at various levels of the ETL.
  • Migrated mappings, sessions, and workflows from development to test and from test to production.
  • Integrated UNIX shell scripts with the Informatica pmcmd command-line utility (see the first sketch after this list).
  • Modified existing mappings to accommodate new business requirements.
  • Used SCD Type 1 to update slowly changing dimension tables (see the second sketch after this list).
  • Involved in marketing campaigns that used SSIS to target clients, sending campaign emails to customers on behalf of pet clinics.
  • Performance-tuned at the source, target, mapping, session, and system levels.
  • Prepared the migration document for moving mappings from the development to the testing and then to the production repositories.
  • Maintained the daily deltas post-production.
  • Worked independently as well as in a team.
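
A minimal sketch of that shell-to-pmcmd integration. The service, domain, folder, and workflow names are assumed placeholders; only the pmcmd flags themselves come from the PowerCenter command-line utility:

    #!/bin/ksh
    # Launch an Informatica workflow via pmcmd and act on its exit code.
    # All names below are placeholders; the password is read from an
    # environment variable (ETL_PASSWD) via pmcmd's -pv option.

    INT_SVC=IS_Dev
    DOMAIN=Domain_Dev
    FOLDER=SALES_DM
    WORKFLOW=wf_load_sales_daily

    pmcmd startworkflow \
        -sv "$INT_SVC" -d "$DOMAIN" \
        -u etl_user -pv ETL_PASSWD \
        -f "$FOLDER" -wait "$WORKFLOW"
    rc=$?

    if [ $rc -ne 0 ]; then
        echo "$(date): $WORKFLOW failed with pmcmd return code $rc" >&2
        exit $rc
    fi
    echo "$(date): $WORKFLOW completed successfully"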
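
And a sketch of the SCD Type 1 update those dimension loads performed: a single MERGE that overwrites changed attributes in place and inserts brand-new members, keeping no history. Table, column, and sequence names are again placeholders:

    #!/bin/ksh
    # SCD Type 1 sketch: update in place, insert new members. Names are placeholders.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    WHENEVER SQLERROR EXIT FAILURE

    MERGE INTO dim_product d
    USING stg_product s
       ON (d.product_id = s.product_id)
    WHEN MATCHED THEN
      UPDATE SET d.product_name = s.product_name,
                 d.category     = s.category
    WHEN NOT MATCHED THEN
      INSERT (product_key, product_id, product_name, category)
      VALUES (dim_product_seq.NEXTVAL, s.product_id, s.product_name, s.category);

    COMMIT;
    EOF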

Tools: Informatica 9.1, SSIS, Oracle 11g, SQL Developer, PL/SQL Developer, PostgreSQL, Oracle Silver Creek Data Quality, DataLens Builder, MS Excel, Red Hat Linux, JIRA, Git.

Confidential, NY

Senior Data Warehouse / ETL Developer

Responsibilities:

  • Consolidated and integrated data into data services to meet the business requirements.
  • Gathered requirements for the different data feeds.
  • Created logical/physical data models using Erwin.
  • Used Informatica PowerCenter Designer to analyze the source data and to extract and transform it from various source systems (Oracle 11g, flat files, CSV, SPL, XLS), incorporating business rules with the objects and functions the tool supports.
  • Created mappings and mapplets in Informatica PowerCenter to transform the data according to the business rules.
  • Enhanced the Oracle packages for the data movement of CORE, Reptnt, and FGR, using various triggers.
  • Developed, tested, and debugged ETL mappings for projects such as MVCR, CSD-FI, FINRA, S&P indices, and other non-rating domains.
  • Promoted code according to the ETL standards; created the test plan for the QA team and mapping design documents describing the data flow between the source and target databases.
  • Involved in upgrading Informatica from version 8.1 to 9.1.
  • Implemented SCD changes to accommodate the business rules.
  • Used the Debugger to troubleshoot mappings and gather information about data and error conditions.
  • Automated the daily/weekly/monthly jobs with standard UNIX shell script functions, scheduling them as cron jobs (see the crontab sketch after this list).
  • Later moved these cron jobs to Autosys to reduce manual intervention by the developers.
  • Followed the Agile methodology, using sprint sessions to show management the progress of the current projects.
  • Monitored the workflows and optimized load times.
  • Created reusable sessions, mappings, and transformations to avoid duplicate objects in the repository.
  • Used web services to pull data from intermediate services for the ETL process.
  • Automated workflows, created pre- and post-session commands, and ran the scheduled jobs using UNIX shell scripting.
  • Excellent analytical, problem-solving, communication, and interpersonal skills, with the ability to interact with individuals at all levels and to work in a team as well as independently.
  • Prepared the standard operating procedures and trained other team members.
  • Worked with global offshore teams, with good onsite/offshore coordination.
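
A sketch of the cron side of that automation; the wrapper script paths and schedules are illustrative only:

    # Illustrative crontab entries (edited with crontab -e).
    # min hour dom mon dow  command
    30 2 * * *  /apps/etl/bin/run_daily_load.ksh   >> /apps/etl/logs/daily.log   2>&1
    0  4 * * 0  /apps/etl/bin/run_weekly_load.ksh  >> /apps/etl/logs/weekly.log  2>&1
    0  5 1 * *  /apps/etl/bin/run_monthly_load.ksh >> /apps/etl/logs/monthly.log 2>&1

Moving these to Autosys typically means registering the same wrapper scripts as job definitions whose starting conditions chain one job to the next, rather than relying on fixed clock times.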

Environment: Informatica PowerCenter 8.1/9.1, Oracle 11g/10g, SQL/PLSQL, Toad, Erwin, Windows, AIX, Red Hat Linux, MS Office.

Confidential, NJ

Informatica / Cognos BI Consultant

Responsibilities:

  • Gathered requirements from the scientists and analyzed the data accordingly.
  • Designed the logical and physical data models.
  • Built the data marts.
  • Designed and built the extraction, transformation, and loading (ETL) modules based on the requirements.
  • Developed complex queries supporting the business requirements.
  • Set up the stage, test, and production environments.
  • Scheduled the nightly loads.
  • Designed and developed reports to the requirements using Cognos.
  • Developed Framework Manager models and built the packages.
  • Developed complex reports for the scientists using Cognos Report Studio.
  • Created ad hoc reports for the end users.
  • Built reports using Report Studio and Analysis Studio.
  • Created pivoted reports for the end users per the business requirements.

Environment: Informatica PowerCenter 8.6, Oracle 10g, SQL/PLSQL, Toad, SQL Navigator, Erwin, Windows, Red Hat Linux, MS Office, Cognos 8.4, UNIX shell scripting.

Confidential, West Point, PA

Data Warehouse / ETL Consultant

Responsibilities:

  • Collaborated with business analysts and the technical team on requirements gathering and analysis.
  • Prepared the high-level design document giving an overview of the technical design specification.
  • Reviewed the overall system design documents for sign-off on the integration of the database components.
  • Provided recommendations for changes and enhancements at different stages of the project and accommodated database management change requests.
  • Used the technical transformation document to design and build the extraction, transformation, and loading (ETL) modules based on the requirements.
  • Loaded data coming from different sources, including market research groups and a sales force automation tool.
  • Wrote procedures to handle data loads from the source systems to the centralized repository.
  • Developed ETL routines using Informatica PowerCenter, creating mappings that used Lookup, Aggregator, Rank, and Expression transformations, mapplets, SQL overrides in Lookups, source filters in Source Qualifiers, and Routers to manage data flow into multiple targets.
  • Used the Workflow Manager for session management, database connection management, and scheduling of jobs to run in the batch process.
  • Troubleshot problems by checking session and workflow logs.
  • Wrote UNIX shell scripts extensively for scheduling and pre/post-session management.
  • Performance-tuned the processes by identifying and optimizing source-to-target mapping and session bottlenecks.
  • Prepared the standard operating procedure (knowledge transfer) document with the information required for maintenance and operation of the application.
  • Provided data loading, monitoring, system support, and general troubleshooting for all the workflows in the application during its production support phase.
  • Reviewed the reporting requirements and designed end-user reports using the Cognos reporting tool.
  • Created automated scripts and Cognos reports to validate the data and the transformations (see the sketch after this list).
  • Communicated with the reporting team in a timely manner to produce results.
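
A sketch of one such validation script: a row-count reconciliation between a staging table and its target for a given load date. The table and column names (stg_sales, fact_sales, load_dt) are placeholders:

    #!/bin/ksh
    # Compare staging vs. target row counts for a load date;
    # a non-zero exit flags a mismatch for the scheduler.

    LOAD_DT=${1:-$(date +%Y-%m-%d)}

    counts=$(sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<EOF
    SET HEADING OFF FEEDBACK OFF PAGESIZE 0
    SELECT (SELECT COUNT(*) FROM stg_sales  WHERE load_dt = DATE '$LOAD_DT')
           || ' ' ||
           (SELECT COUNT(*) FROM fact_sales WHERE load_dt = DATE '$LOAD_DT')
      FROM dual;
    EOF
    )

    src=$(echo $counts | awk '{print $1}')
    tgt=$(echo $counts | awk '{print $2}')

    if [ "$src" != "$tgt" ]; then
        echo "Validation FAILED for $LOAD_DT: staging=$src target=$tgt" >&2
        exit 1
    fi
    echo "Validation passed for $LOAD_DT: $src rows"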

Environment: Informatica PowerCenter 8.x, Oracle 10g, SQL/PLSQL, Toad, Erwin 7.2/4.2, Windows XP Professional, MS Office, UNIX shell scripting, Cognos 8.x.

Confidential, PA

Junior Data Warehouse Consultant

Responsibilities:

  • Collected and clarified requirements from the internal trading systems throughout the project.
  • Developed the technical specifications document based on the business requirements.
  • Developed the data model using reverse engineering.
  • Identified entities, attributes, and relationships, and converted the conceptual model to the logical model.
  • Analyzed the data to establish cardinality between the entities of the logical data model.
  • Designed and developed the staging area environment and extracted data from multiple sources across the fixed-income trading, derivatives, and equity systems.
  • Front-end trades were generated as flat files.
  • Reconciled all the data obtained from the front office and the back office.
  • Debugged invalid mappings using breakpoints, and tested stored procedures and functions, data load jobs, batches, and target data.
  • Identified bottlenecks and tuned the performance of data mappings and sessions for large data sets by implementing pipeline partitioning and increasing the block size, data cache size, sequence buffer length, and target-based commit interval.
  • Developed PL/SQL procedures that create and drop tables and indexes for performance as part of pre- and post-session management (see the sketch after this list).
  • Executed UNIX shell scripts for data loading, job scheduling, and SQL script execution.
  • Administered and scheduled the nightly data loads.
  • Optimized the mappings and encouraged the use of reusable objects such as mapplets to reduce run time.
  • Drafted the service level agreement for production support to the downstream applications, i.e., regulatory compliance.
  • Worked with the regulatory group to deliver data on time for further reporting.
  • Developed business intelligence reports to track the finance metrics using Cognos.
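
A sketch of the pre/post-session pattern those procedures implemented: drop the target's indexes before the bulk load and recreate them afterwards, so the load is not slowed by index maintenance. The index and table names are placeholders:

    #!/bin/ksh
    # idx_maint.ksh pre|post -- index/table names are placeholders.
    step=$1

    # $step expands into the PL/SQL block before SQL*Plus runs it.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<EOF
    WHENEVER SQLERROR EXIT FAILURE
    BEGIN
      IF '$step' = 'pre' THEN
        EXECUTE IMMEDIATE 'DROP INDEX idx_trade_dt';
        EXECUTE IMMEDIATE 'DROP INDEX idx_trade_acct';
      ELSE
        EXECUTE IMMEDIATE 'CREATE INDEX idx_trade_dt ON fact_trade (trade_dt)';
        EXECUTE IMMEDIATE 'CREATE INDEX idx_trade_acct ON fact_trade (account_id)';
      END IF;
    END;
    /
    EOF

Such a script would be wired in as the session's pre- and post-session shell commands, e.g. idx_maint.ksh pre before the load and idx_maint.ksh post after it.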

Environment: Informatica 8.x, Erwin 4.4, Cognos 7.x, Oracle 9i, MS Office, Windows NT, TOAD, SQL*Loader, XML, MS Visio
