Data Engineer Resume
Las Vegas, NV
SUMMARY:
- Enthusiastic, highly skilled consultant with 9+ years of experience in Data Modeling and Data Warehousing (ETL/BI).
- Experienced data modeler/developer, possessing superior technical/analytical skills and the ability to meet deadlines.
- Enjoy sharing technical expertise and coordinating projects with team members.
- Knowledge of all phases of the Data Warehouse Development Life Cycle, including EDW.
- In-depth understanding of dimensional modeling and data mart development, including star and snowflake schemas and slowly changing dimensions.
- Extensive knowledge of Ralph Kimball’s methodologies, Business Process Re-engineering, and Database Design Methodologies.
- Proficient in Conceptual, Third Normal Form (3NF), Operational Data Store (ODS), and dimensional modeling techniques, applying normalization and dimensional concepts.
- Promote and follow Enterprise Architecture standards, guidelines, and data quality assessment best practices, integrated with my database design efforts.
- Experience in fast-paced Agile development environments and methodologies, including Scrum and test-driven development.
- Good exposure to various software life cycle models such as Waterfall, Agile, and Rapid Prototyping.
- Experience as a technical lead for Onsite-Offshore team.
- Experience in gathering requirements, design and hands on development in all phases of the project.
- Strong development skills including the ability to work through the entire development cycle from gathering requirements through implementation.
- Data analysis skills: able to dig in and understand complex models and business processes. Very strong communication and interpersonal skills with people of all levels and roles.
- Development strategies for exposing data in the form of Reports, Tiles, Dashboards, and Scorecards in COGNOS.
- Development strategies for Extraction, Transformation and Loading (ETL) of data from various sources into Data Marts and Data Warehouses using Informatica Power Center. ETL experience includes working with Informatica Power Center 7.x/8.x/9.x/10.x on development projects on the UNIX platform.
- Development of rules and watchlists in Informatica RulePoint, integrated with ETL logic.
- Strong experience in designing and developing complex mappings to extract data from diverse sources including flat files, RDBMS tables and legacy system files.
- Experience in Informatica mapping specification documentation, tuning mappings to enhance the performance, proficient in using Informatica Server Manager to create and schedule workflows and sessions.
- Experience in integration of various data sources like SQL Server, Oracle and DB2.
- Familiarity with SQL and PL/SQL, including writing Stored Procedures, Functions, Cursors, and Triggers.
- Experience in UNIX Shell Scripting.
- Experience in Windows Batch Scripting
- Experience working in cloud environments, Windows Azure.
- Experience in data virtualization concepts (Denodo).
- Strong experience in scheduling Autosys and Control M jobs.
TECHNICAL SKILLS:
Data Modelling Tools: ER Studio Data Architect 9.0, ERWIN, Microsoft Visio
ETL Tools: Informatica 8.6, 9.0.1, 9.1, 9.6, 10.2.0
RDBMS: Oracle 8.x/9.x/10g/11g, SQL Server 2005/2008/2012/2016, DB2, PostgreSQL 4.1, Teradata
Languages: SQL, PL/SQL, Unix Scripting, Windows Scripting
OS: UNIX, Windows 2000/2003/XP/Vista/Windows 7/Windows 10
Tools: Toad, Rapid SQL, ControlM, Autosys, Putty, FileZilla, WinSCP, SQL Assistant, SoapUI, XML Spy, Microsoft PowerPoint, Excel, Cognos 9, Tableau, Denodo, Informatica RulePoint, Ipswitch, IBM Platform Manager, UltraEdit, JIRA
PROFESSIONAL EXPERIENCE:
Confidential, Las Vegas, NV
Data Engineer
Responsibilities:
- Development, enhancement, verification, and maintenance of ETLs from various sources to the Mountain West Oracle data warehouse and to external clients. Responsible for designing, developing, implementing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes using Informatica Power Center.
- Ensured the data integrity and quality control required for the success of Healthcare Economics analytical data deliverables. Created different kinds of pre/post-load audit processes for quality assurance.
- Experience with high volume datasets from various sources like Oracle, Text Files, and Relational Tables.
- Verify and validate ETL deliverables. Troubleshoot problems and communicate to the team.
- Translate concepts to requirements, and development into an automated production process. Complete projects and development activities timely and accurately while following the System Development Life Cycle (SDLC).
- Created Oracle packages and stored procedures. Created stored procedures in SQL Server.
- Production deployments, release activities and change management. Created automation deployment scripts.
- Experience in finding the performance bottlenecks and redesigning the ETL process to improve the performance. Fine tuned ETL processes by considering mapping and session performance issues.
- Reviewed and understood the ETL processes and functionality, identified test requirements and coverage, and reviewed test cases and scripts based on client requirements. Reported and tracked defects in Quality Center.
- Involved in Performance tuning for complex ETL codes.
- Used Power Center Workflow manager for session management, database connection management and scheduling of jobs.
- Document all ETL related work per company's methodology.
- Hands on experience with Windows batch scripting.
Environment: Informatica Power Center 10.2.0 HotFix/9.6, SQL Developer, Windows 10, TOAD 12.8.0, Oracle 11g/12c, Star and Snow-Flake schemas, Microsoft Excel/Access, Flat files.
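A representative shape for the pre/post-load audit processes described above can be sketched in SQL. This is an illustrative example only: the table names (stg_claims, fact_claims) and the batch_id column are hypothetical, not taken from the actual project.

```sql
-- Post-load reconciliation: compare the staging row count against the
-- warehouse fact table per batch, and flag any mismatch for follow-up.
-- stg_claims / fact_claims / batch_id are illustrative names.
SELECT s.batch_id,
       s.src_rows,
       t.tgt_rows,
       CASE WHEN s.src_rows = t.tgt_rows THEN 'PASS' ELSE 'FAIL' END AS audit_status
FROM   (SELECT batch_id, COUNT(*) AS src_rows
        FROM   stg_claims
        GROUP  BY batch_id) s
JOIN   (SELECT batch_id, COUNT(*) AS tgt_rows
        FROM   fact_claims
        GROUP  BY batch_id) t
ON     s.batch_id = t.batch_id;
```

A pre-load variant of the same query typically runs against the source extract and a landing table before the transformation step.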
Confidential, Chandler, AZ
ETL Developer
Responsibilities:
- Worked closely with Business Analyst and the end users for understanding existing business model and customer requirements and involved in preparing the functional specifications based on the business requirement needs.
- Responsible for designing, developing, implementing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes using Informatica Power Center.
- Worked with Informatica RulePoint to create rules that check business rules as required and load the results into Oracle database tables. Designed and developed several watchlists and rules.
- Experience with high volume datasets from various sources like Oracle, Text Files, and Relational Tables.
- Developed transformation logic and designed various Complex Mappings and Mapplets using the Designer.
- Experience in working with full SDLC along with Agile Methodology.
- Hands-on experience with Test Cycle Management tool- JIRA.
- Experience in finding the performance bottlenecks and redesigning the ETL process to improve the performance. Fine tuned ETL processes by considering mapping and session performance issues.
- Defined and worked with mapping parameters and variables.
- Migrated code across environments using folder-to-folder copy, Informatica labels, and Deployment Group methods. Troubleshot problems in the QA and UAT phases.
- Reviewed and understood the ETL processes and functionality, identified test requirements and coverage, and reviewed test cases and scripts based on client requirements. Reported and tracked defects in Quality Center.
- Involved in Performance tuning for complex ETL codes.
- Extensively worked with Pushdown Optimization for high volumes of data.
- Used Power Center Workflow manager for session management, database connection management and scheduling of jobs.
- Experienced in using the Informatica partitioning feature to load the data into database.
- Document all ETL related work per company's methodology.
- Hands on experience with Unix scripting.
- Working with a global team and responsible for directing/reviewing the test planning and execution efforts of an offshore team.
Environment: Informatica Power Center 10.2.0 HotFix/9.6, SQL Developer, Unix, Windows 10, TOAD 12.8.0, Oracle 11g/12c, Star and Snow-Flake schemas, Microsoft Excel/Access, Informatica Rulepoint 6.2, Ipswitch WS FTP 12.6, WinSCP, Autosys R11, IBM Platform Manager, Putty, Flat files, XML, JIRA
Confidential, AZ
ETL Developer
Responsibilities:
- Designed and implemented the Score Carding, Reporting, and Custom Dashboarding solutions to expose different metrics related to the Corrective Actions Programme (CAP) using Informatica ETL and COGNOS. Worked with hierarchical data to create personnel and organizational hierarchies and helped design reporting solutions to retrieve it efficiently.
- Acted as coordinator between the business users and the technical team, converting business requirements into technical designs.
- Extensively worked in ER Studio to create individual data marts and modify DDL scripts for existing structures in the data warehouse.
- Extensively involved in analyzing/designing solutions for big data volumes arriving at hourly and 15-minute intervals from smart meters, covering both renewable and non-renewable sources.
- Worked with APS business partners to consume/provide data inbound/outbound.
- Created functional/technical specifications documents.
- Designed and implemented data flow and auditing processes to support reporting services and data analysis.
- Participated in user meetings, gathered Business requirements & specifications for the Data-warehouse design. Translated the user inputs into ETL design docs.
- Coordinate with ETL team to implement all ETL procedures for all new projects and maintain effective awareness of all production activities according to required standards and provide support to all existing applications.
- Designed ETL architecture to process large numbers of files and created high-level and low-level design documents.
- Extensively used TOAD to test, debug SQL Queries.
- Designed EAI data/process flow logic for movement of data between operational systems (MAXIMO, Supply Chain) using SoapUI / REST APIs.
- Explored data virtualization technologies (Denodo) to expose data from multiple channels through a cross-system platform.
- Parsed high-level design specifications into simple ETL coding and mapping standards.
- Developed mapping designs to generate reports for Revenue Accounting System (RAS) for interfacing month-end close general ledger (GL) accounting data and transactional policy/endorsement detail.
- Experience with finance modules such as P2P.
- Worked on Informatica Power Center 9.6 Tool - Designer, Work Flow Manager, Work Flow Monitor and Repository Manager.
- Performing code review, performance testing, and specification review
- Implementing performance tuning methods to optimize developed mappings.
- Worked on Control M to schedule the Jobs.
- Responding quickly and effectively to production issues and taking responsibility for seeing those issues through resolution
- Worked on Accounting data to do balance sheets, ledgers and classification of marketing and trade data.
- Responsible for migrations of the code from Development environment to QA and QA to Production.
- Carried out defect analysis and fixed bugs raised by the users.
- Involved in support and troubleshooting of production systems as required, optimizing performance, resolving production problems, and providing timely follow-up on problem reports.
- Involved in creating drill-down/drill-through reports, dashboards, scorecards in COGNOS.
Environment: Informatica Power Center 9.1/9.6, SQL assistant, Unix, Windows 7, TOAD, Rapid SQL, Oracle 11g, Star and Snow-Flake schemas, Altova XML Spy, Microsoft Excel/Access, ER Studio Data Architect 9.0, Cognos 9, Tableau, Denodo
Confidential, AZ
ETL Developer
Responsibilities:
- Defined the initial Data Architecture, Data Governance, Data Naming Standards and Conventions, Data Flow Diagrams, Data Requirements, and initial deliverable template formats for a new Oracle 11.2 Business Intelligence (BI) Enterprise Data Warehouse (EDW) project.
- Initiated and participated in the data modeling tool research, analysis, and selection process. Computer Associates (CA) Erwin 9.x and Embarcadero ER/Studio Data Architect 9.7 were considered.
- Involved in requirement gathering, analysis and designing technical specifications for the data migration according to the Business requirement.
- Worked with web services to consume, validate and load data in XML/JSON formats.
- Developed complex mappings, including SCD Type I, Type II, and Type III mappings, in Informatica to load data from various sources, using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Router, and SQL transformations. Created complex mapplets for reuse.
- Involved in optimization and performance tuning of Informatica objects and Database objects to achieve better performance.
- Offered L1 production support for daily jobs.
- Performed Unit and Integration testing and wrote test cases.
- Worked extensively in defect remediation and supported the QA testing.
- Documented all test procedures for systems and processes, and coordinated with business analysts and users to resolve requirement issues and maintain quality.
- Validated all designs, scheduled all ETL processes, and prepared documents for all data flow diagrams.
- Hands on exposure on UNIX Environment and experience in using third party scheduling tool, Autosys.
- Develop and manage project and technical documentation
- Complete ad hoc projects, as needed.
Environment: Informatica Power Center 9.1/9.6, Power Exchange, SQL assistant, Unix, Windows 7, Autosys, Star and Snow-Flake schemas.
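The SCD Type II pattern referenced above can be sketched in plain SQL. The production loads used Informatica mappings (Lookup plus Update Strategy); the tables dim_customer and stg_customer and the tracked address attribute are hypothetical names used only for illustration.

```sql
-- SCD Type II, two-step sketch:
-- 1) expire the current dimension row when a tracked attribute changed;
-- 2) insert a fresh "current" row for new and changed customers.
-- dim_customer / stg_customer / address are illustrative names.
UPDATE dim_customer d
SET    d.eff_end_date = SYSDATE,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    s.address    <> d.address);

INSERT INTO dim_customer
       (customer_key, customer_id, address, eff_start_date, eff_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, SYSDATE, NULL, 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y');
```

Because the first statement expires changed rows before the second runs, the single insert covers both brand-new customers and new versions of changed ones.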
Confidential, Littleton, MA
ETL Developer
Responsibilities:
- Worked in requirements gathering to build a data mart providing analysis of patient diagnostic data from multiple hospitals nationwide within the Windows Azure cloud infrastructure.
- Worked as a business analyst in the Imagestream Medical domain, visiting sites to participate in meetings with the nurses/operators who run the medical devices that feed data into the Windows Azure cloud environment, in order to better understand and analyze source data, processes, and schedules.
- Involved in requirement gathering, analysis and designing technical specifications for the data migration according to the Business requirement.
- Designed the ERWIN logical and physical data models of the data warehouse.
- Created functional/technical specifications documents.
- Explored Azure Data Factory and Hadoop solutions for bulk loading into SQL Server databases.
- Worked with offshore UI team to help extract data by building efficient SQL queries needed for operational teams.
- Worked with Java in parsing data files to upload into data tables.
- Designed SSIS packages to extract and load data within the cloud environment, from pipe-delimited text files into a Windows Azure SQL Server database.
- Designed multiple staging areas/data marts as needed to efficiently and effectively tune ETL processes.
- Implemented bitmap/B-tree indexes on data tables to efficiently retrieve incremental data.
Environment: Windows Azure, SQL Server 2008, 2015, 2016, ERWIN, Windows Batch Scripting, Java.
Confidential, Plano,TX
ETL Developer
Responsibilities:
- Collaborated with developers and business users to gather required data, executed ETL programs and scripts, implemented data warehouse activities, and prepared the corresponding reports.
- Performed root cause analysis on all processes, resolved production issues, validated data, performed routine database tests, and provided support for all ETL applications.
- Developed and performed tests on all ETL code, analyzed the data, and designed data mapping techniques for the data models in the systems.
- Supported the ETL schedule and maintained compliance with it; developed and maintained standards for ETL code and an effective project life cycle for all ETL processes.
- Documented all test procedures for systems and processes, and coordinated with business analysts and users to resolve requirement issues and maintain quality.
- Monitored business requirements, validated designs, scheduled ETL processes, and prepared data flow diagram documents. Participated in user meetings, gathered business requirements and specifications for the data warehouse design, and translated user inputs into ETL design documents.
- Designed ETL architecture to process large numbers of files and created high-level and low-level design documents.
- Extensively used TOAD to test, debug SQL Queries.
- Involved in support and troubleshooting of production systems as required, optimizing performance, resolving production problems, and providing timely follow-up on problem reports.
- Testing, troubleshooting within IBM Cognos.
- Responsible for IBM and ICM model migration.
- Code migration using IBM Cognos from development to testing and production environments.
- Experience in writing shell scripts.
- Familiarity with writing stored procedures.
- Tested the data and data integrity among various sources and targets and associated with Production support team in various performances related issues.
- Extensively worked on creating Control M jobs to schedule workflow and file executions.
Environment: Informatica Power Center 8.5/8.6, IDE, Oracle 10g, Toad, UNIX, ControlM
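The source-to-target data integrity testing mentioned above often reduces to set-difference queries. A minimal SQL sketch, assuming hypothetical src_orders and tgt_orders extracts (the real tables and keys would come from the mapping specification):

```sql
-- Rows present in the source extract but missing, or carrying a
-- different amount, in the target: any rows returned indicate a
-- load defect to investigate. src_orders / tgt_orders are illustrative.
SELECT order_id, order_amt FROM src_orders
MINUS
SELECT order_id, order_amt FROM tgt_orders;
```

Running the same query with the operands swapped catches rows that appear in the target without a matching source record.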