- Positive and self-motivated professional with 8 years of experience in Oracle PL/SQL development across business domains such as Banking and Financial Services, Insurance, and Telecom.
- 3 years of experience in SAS programming.
- Hands-on experience across the software project life cycle, from requirement analysis through post-implementation and warranty support in BAU mode.
- Good knowledge of logical and physical data modeling using normalization techniques.
- Experience with data warehouse concepts, ETL, and slowly changing dimensions (SCD).
- Hands-on experience in database design, development, performance tuning, data integration, and data migration.
- Expertise in PL/SQL development and implementation of database objects such as stored procedures, functions, packages, triggers, views, materialized views, cursors, and ref cursors.
- Good hands-on experience migrating data from flat files to Oracle tables using SQL*Loader and external tables.
- Proficient in advanced PL/SQL features such as collections, dynamic SQL, and BULK COLLECT.
- Experience in handling JSON and XML data formats
- Good understanding of Oracle Exadata platform concepts.
- Experience in SAS/Base and SAS/Macro programming and debugging in UNIX environments.
- Experience using SAS functions, procedures, macros, and other SAS facilities for data updates, data cleansing, and reporting.
- Extensively used PROC REPORT and the SAS ODS facility to generate reports in RTF, PDF, and HTML formats.
- Good knowledge of Big Data technologies: Hadoop, MapReduce, HDFS, Hive, HBase, YARN, Sqoop, and Oozie.
- Experience with job scheduling tools such as Autosys and Control-M for creating, scheduling, and monitoring jobs.
- Good team player with experience working on Agile projects and completing tasks within sprint timelines.
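As context for the flat-file migration experience noted above, a minimal external-table sketch (directory object, file, table, and column names are all hypothetical, not from an actual project):

```sql
-- Hypothetical example: expose a CSV file as an Oracle external table,
-- then load it into a regular table with plain SQL. All names here are
-- illustrative; 'data_dir' must be a pre-created DIRECTORY object.
CREATE TABLE customers_ext (
  customer_id   NUMBER,
  customer_name VARCHAR2(100),
  signup_date   DATE
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    (customer_id   CHAR,
     customer_name CHAR,
     signup_date   CHAR(10) DATE_FORMAT DATE MASK "YYYY-MM-DD")
  )
  LOCATION ('customers.csv')
)
REJECT LIMIT UNLIMITED;

-- Load the target table set-wise; bad rows go to the reject/log files.
INSERT INTO customers
SELECT customer_id, customer_name, signup_date
  FROM customers_ext;
```

The same file could equally be loaded with a SQL*Loader control file; external tables are convenient when the load should be driven from SQL.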
Operating systems: UNIX, MS Windows
Languages: Oracle PL/SQL, SAS/Base, SAS/Macros, Ksh scripting
Databases: Oracle, Teradata, MySQL, SQL Server, HBase
Tools: SAS 9.4 Enterprise Guide, TOAD, SQL Developer, SQL*Loader, SQL*Plus, Clarify CRM 6.0, SuperPuTTY, ServiceNow, Visio, JIRA, Confluence, WinSCP
Job schedulers: Control-M, Autosys, cron
Hadoop/Big Data tools: HDFS, YARN, Hive, Sqoop, Hue, HBase
Version control: Tortoise SVN, Git (Bitbucket)
PL/SQL Developer / SAS Programmer
Confidential, Wilmington, DE
- Perform Oracle PL/SQL code changes for the online account card portal.
- Create and modify objects such as tables, packages, procedures, cursors, ref cursors, views, triggers, and DB links based on business requirement changes.
- Develop materialized views for data replication in distributed environments.
- Load data from flat files into Oracle tables using external tables and SQL*Loader.
- Develop and modify SAS code per new enhancement requirements to business rules.
- Performed SQL/PL/SQL performance tuning using EXPLAIN PLAN, DBMS_PROFILER, and TKPROF.
- Created indexes on the tables for faster retrieval of the data to enhance database performance.
- Extensively used hints to direct the optimizer to choose an optimum query execution plan.
- Used BULK COLLECT for better performance and easier data retrieval by reducing context switching between the SQL and PL/SQL engines.
- Experience in writing complex SQL queries.
- Develop SAS code using Base SAS procedures such as PROC SORT, PROC SQL, PROC PRINT, PROC IMPORT, and PROC EXPORT, and debug errors in the UNIX environment.
- Use the LIBNAME and SQL pass-through facilities to read data from other database sources such as Oracle and Teradata.
- Implemented AUTHDOMAIN-based authentication for connecting SAS to databases via the metadata server as part of security controls, eliminating username/password authentication.
- Developed SAS macros and modified pre-written macros to accommodate business logic changes as needed.
- Use the SAS hash object technique for data set integration and lookups in the DATA step to improve performance.
- Created customized reports in SAS using SAS ODS facility.
- Perform UAT and obtain sign-off from the Product Owner and business users.
- Involved in creating a data lake by extracting Big Data from various sources into Hadoop HDFS, including data from Excel, flat files, Oracle, Teradata, and HBase.
- Involved in data migration from RDBMS (Oracle, Teradata) to a distributed solution on Hadoop clusters using Cloudera (CDH 5).
- Designed the data flow in Hadoop from data ingestion and compression with Sqoop into HDFS, followed by conversion, transformation, and dynamic partitioning and bucketing using Impala and Hive.
- Designed Oozie workflows for data transfer between RDBMS (Oracle) and Hive.
- Stored data from HDFS in the respective Hive tables for further analysis to identify trends in the data.
- Set up and tested new jobs in the Autosys scheduler for SAS programs and UNIX shell scripts.
- Developed, exercised, and documented test plans to ensure developed code meets specifications.
- Created various sample datasets in the testing environment and conducted UAT/reviews for sign-off.
- Perform production deployments for ad hoc code changes per business requirements.
- Work on ad hoc queries for various application reports generated on a daily, weekly, or monthly basis.
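The batched bulk-processing pattern mentioned in the tuning bullets above can be sketched as follows (table, column, and status values are hypothetical placeholders, not taken from the actual application):

```sql
-- Hypothetical sketch of the BULK COLLECT / FORALL pattern: fetch rows
-- in fixed-size batches and apply updates set-wise, cutting context
-- switches between the SQL and PL/SQL engines.
DECLARE
  TYPE t_ids IS TABLE OF accounts.account_id%TYPE;
  l_ids t_ids;
  CURSOR c_stale IS
    SELECT account_id FROM accounts WHERE status = 'STALE';
BEGIN
  OPEN c_stale;
  LOOP
    FETCH c_stale BULK COLLECT INTO l_ids LIMIT 1000;  -- batch of 1000
    EXIT WHEN l_ids.COUNT = 0;

    FORALL i IN 1 .. l_ids.COUNT                       -- one engine round trip
      UPDATE accounts
         SET status = 'REFRESHED'
       WHERE account_id = l_ids(i);
  END LOOP;
  CLOSE c_stale;
  COMMIT;
END;
/
```

The LIMIT clause keeps PGA memory bounded; without it, BULK COLLECT would pull the entire result set into the collection at once.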
- Requirement analysis and design of models based on discussions with application owners and business users.
- Developed SAS code to generate user reports (standard data files) per business requirements, converted Excel reports to SAS datasets, and tested and debugged the same.
- Created database objects like Tables, Views, Stored Procedures, Functions, Packages, DB triggers, Indexes.
- Used bulk processing techniques (BULK COLLECT, FORALL, SAVE EXCEPTIONS, SQL%BULK_EXCEPTIONS) to improve performance when loading and updating large data sets with reduced context switching.
- Created PL/SQL scripts to extract data from the operational database into simple flat text files using the UTL_FILE package.
- Handled JSON and XML data.
- Performed SQL/PL/SQL performance tuning with EXPLAIN PLAN, DBMS_PROFILER, and TKPROF.
- Created ksh scripts to automate housekeeping processes, scheduled report jobs, and manual file transfers from one server to another.
- Prepared maintenance requirement and technical specification documents per client requirements.
- Created UNIX ksh scripts for batch processes and scheduled them using Control-M.
- Coordinated with the testing team to analyze and resolve reported defects.
- Prepared project documentation such as the Functional Requirement Specification (FRS) and ER diagrams.
- Worked with application owners to automate and generate ad hoc reports for their business-improvement studies and batch processes using UNIX shell scripting.
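The flat-file extract bullet above follows a common UTL_FILE shape; a minimal sketch (the directory object, file name, and query are hypothetical):

```sql
-- Hypothetical UTL_FILE extract: write query results to a pipe-delimited
-- flat file via a pre-created DIRECTORY object. All names illustrative.
DECLARE
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  l_file := UTL_FILE.FOPEN('EXTRACT_DIR', 'orders_extract.txt', 'w');
  FOR r IN (SELECT order_id, amount FROM orders) LOOP
    UTL_FILE.PUT_LINE(l_file, r.order_id || '|' || r.amount);
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
EXCEPTION
  WHEN OTHERS THEN
    -- Close the handle on failure so the OS file is not left locked.
    IF UTL_FILE.IS_OPEN(l_file) THEN
      UTL_FILE.FCLOSE(l_file);
    END IF;
    RAISE;
END;
/
```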
- Developed the Ezpay application using Oracle 11g, PL/SQL Developer, and TOAD.
- Worked on account summary batch services; created stored procedures, packages, tables, indexes, arrays, records, dynamic SQL, etc.
- Responsible for ad hoc changes at the database level.
- Designed job schedule information for batch jobs with the Autosys scheduler.
- Gathered requirements from the business and prepared use case diagrams and business requirement documents (BRD).
- Worked on preparation of project plan, functional specification document, PL/SQL code review, and test case creation.
- Adhered to timelines and delivered without slippage.
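The native dynamic SQL mentioned in the batch-services bullet above typically looks like the following sketch (table, column, and bind names are hypothetical):

```sql
-- Hypothetical native dynamic SQL: assemble a statement at run time and
-- pass values through bind variables rather than string concatenation,
-- which avoids SQL injection and repeated hard parses.
DECLARE
  l_sql   VARCHAR2(200);
  l_count NUMBER;
BEGIN
  l_sql := 'SELECT COUNT(*) FROM account_summary WHERE run_date = :d';
  EXECUTE IMMEDIATE l_sql INTO l_count USING TRUNC(SYSDATE);
  DBMS_OUTPUT.PUT_LINE('Rows for today: ' || l_count);
END;
/
```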