- Dynamic professional with 8+ years of experience providing Business Intelligence solutions in Data Warehousing for Decision Support Systems.
- Experience in the analysis, design, development, and support of data warehousing solutions, and in developing strategies for Extraction, Transformation and Loading (ETL) using ETL tools along with Big Data (Hadoop).
- Knowledge of full life cycle development for building a data warehouse.
- Excellent programming skills with the ability to automate routine tasks using shell scripting and schedulers (Autosys, Maestro, Informatica, Crontab) in Big Data environments, with good experience in Oracle SQL and Teradata.
- Worked for about 3 years on most components of Informatica PowerCenter, supporting, creating, executing, testing, and maintaining mappings; also experienced with the Ab Initio Co>Operating System in application tuning and debugging strategies.
- Experience integrating various data sources with multiple relational databases such as Oracle, including data from flat files and Teradata tables.
- Exposure to multifile systems (MFS).
- Involved in performance tuning of SQL queries by generating explain plans and monitoring through Teradata Viewpoint.
- Knowledge of other ETL tools such as Informatica, Ab Initio, and DataStage, and of reporting tools such as Cognos.
- Highly motivated, employee-focused professional with extensive experience in coding, deployment, monitoring, audits, and documentation.
- Highly creative and self-motivated with innovative and effective ideas and concepts for improving efficiency.
- Comfortable interacting with cross-cultural teams across the globe.
- Energetic professional known for the ability to envision and create successful outcomes in complex, multicultural environments.
ETL Tools: Informatica PowerCenter, DataStage, Ab Initio GDE, Ab Initio Co>Operating System
Databases: Oracle 11g, Teradata, MS SQL Server
Programming Languages: C, C++, HTML
Defect Tracking: SharePoint logging
Reporting: Cognos 10.1
Operating Systems: Windows NT/2000/XP/98/7, UNIX.
Emailing: IBM Lotus Notes and Microsoft Outlook
- Designed ETL applications and developed data warehouse applications based on the technical/functional specifications.
- Involved in meetings to gather information and ad-hoc requirements from business users.
- Prepared the detailed design document for all the modules required for development.
- Designed and developed ETL jobs that extract data from Teradata tables and flat files and load it into an Oracle data warehouse using Informatica and Big Data tools (Sqoop, Hive).
- Coordinated development work with team members, reviewed ETL jobs, and created scripts for job scheduling and implementation.
- Involved in creation of proper test data to satisfy all required test cases and performed unit and system integration testing on all deliverables.
- Developed data transformation, loading, scrubbing, and extraction programs using the Ab Initio ETL tool.
- Designed and developed graphs using the GDE with components such as Partition by Round-robin, Partition by Key, Rollup, Sort, Scan, Dedup Sorted, Reformat, Join, Merge, Gather, Normalize, and Concatenate.
- Also used components such as Filter by Expression, Partition by Expression, Replicate, Partition by Key, and Sort.
- Worked with departition components such as Gather and Interleave to departition and repartition data from multifiles as needed.
- Created summary tables using Rollup, Scan, and Aggregate.
- Implemented a phasing and checkpoint approach in the ETL process to prevent data loss and maintain uninterrupted data flow in the event of process failures.
- Enhanced ETL performance using data parallelism (multifiles), component parallelism, pipeline parallelism, and in-memory sorting.
- Implemented partitioning techniques using the multifile system.
- Wrote shell scripts to create processes involving multiple graphs and to call .ksh scripts, SQL queries, and UNIX commands. The graphs were fully parameterized, with parameters passed to the graphs as environment variables from wrapper scripts.
- Redesigned the existing graphs and documented all the new and enhancement requests.
- Analyzed the issues with the unmatched records and provided code fix to the problems.
- Deployed and executed Ab Initio jobs in a UNIX environment.
- Good knowledge of UNIX commands, SQL queries, and Teradata queries.
- Involved in L3 production support; resolved issues such as missing files, storage, loading, and logic errors.
- As the onsite team lead, represented the team in meeting Service Level Agreements.
- Worked with other ETL tools such as DataStage, the reporting tool Cognos, and the support tool Remedy for ticket logging and tracking.
- Currently involved in job scheduling using the Autosys scheduler and in performance tuning of SQL queries via explain plans.
- Intermediate-level knowledge of UNIX shell scripting, HTML, and Cognos.