Sr. Application/Data Developer Resume
Newark, DE
SUMMARY
- Almost 10 years of strong working experience in Oracle PL/SQL software development and SAS programming across industries such as Banking, Insurance, and Telecom.
- Experience in Big Data/Hadoop development technologies: Pig, Hive, Spark, AWS Cloud, HDFS, YARN, MapReduce, Sqoop, Kafka, HBase, Teradata, Kudu, Oozie, Scala, and Hue.
- Prior experience in developing web applications using Java/J2EE.
- Strong experience across the software development life cycle (analysis, design, development, testing, implementation, and support) in both Waterfall and Agile methodologies, as well as data warehouse concepts, ETL, and slowly changing dimensions (SCD).
- Hands-on experience in database design, development, performance tuning, data integration, and data migration.
- Expertise in PL/SQL, developing and implementing database objects such as stored procedures, functions, packages, triggers, views, materialized views, cursors, and ref cursors.
- Expertise in collections, SQL*Loader, dynamic SQL, and BULK COLLECT techniques.
- Expertise in SAS/Base and SAS/Macro programming and debugging in UNIX environments.
- Experience in using SAS procedures such as PROC SQL, PROC SORT, PROC FREQ, PROC UNIVARIATE, PROC MEANS, PROC TRANSPOSE, and PROC IMPORT.
- Experience in utilizing SAS functions, procedures, macros, and other SAS applications for data updates, data cleansing, and reporting.
- Experience in creating and actively monitoring application jobs with the AutoSys scheduler and Control-M.
- Good knowledge of logical and physical data modeling using normalization techniques.
TECHNICAL SKILLS
Operating systems: UNIX, Linux, MS Windows
Methodology: Agile, Waterfall
Languages: Oracle PL/SQL, SAS/Base, SAS/Macros, Java/J2EE.
Hadoop/Big Data Tools: Pig, Hive, Spark, AWS Cloud, HDFS, YARN, MapReduce, Sqoop, Kafka, HBase, Teradata, Kudu, Oozie, Scala, Hue.
Scripting Tools: Python, Shell
Databases: Oracle, Teradata, SQL Server, HBase, DB2, MySQL
Tools: SAS 9.4 Enterprise Guide, TOAD, SQL Developer, SQL*Loader, SQL*Plus, Clarify CRM 6.0, PuTTY, ServiceNow, Visio, JIRA.
Job schedulers: crontab, AutoSys, Control-M.
Version control & IDEs: SVN (TortoiseSVN), Git (Bitbucket), Eclipse, Maven.
PROFESSIONAL EXPERIENCE
Sr. Application/Data Developer
Confidential, Newark, DE
Responsibilities:
- Perform Oracle PL/SQL code changes for the online card account portal.
- Create and modify objects such as tables, packages, procedures, cursors, ref cursors, views, materialized views, and DB links as business requirements change.
- Create integrity constraints and database triggers for data validation.
- Load data from flat files into the Oracle DB using external tables (see the external-table sketch after this list).
- Develop Big Data applications following the Agile software development life cycle for fast, efficient progress.
- Write software to ingest data into Hadoop and build scalable, maintainable extract, transform, and load (ETL) jobs.
- Create Scala/Spark jobs for data transformation and aggregation with a focus on the functional programming paradigm.
- Develop shell scripts to automate end-to-end data management and integration work.
- Develop Oozie workflows for scheduling and orchestrating the ETL process.
- Manage UNIX systems for optimal hosting of enterprise web applications.
- Use BULK COLLECT with the SAVE EXCEPTIONS clause extensively to load and update large datasets (see the bulk-load sketch after this list).
- Perform SQL/PLSQL performance tuning using EXPLAIN PLAN and DBMS Profiler.
- Write SAS procedures such as PROC SORT, PROC SQL, PROC PRINT, PROC IMPORT, and PROC EXPORT, and debug errors in the UNIX environment.
- Use the LIBNAME statement and SQL pass-through facility to read data from other DB sources such as Oracle and Teradata.
- Eliminate hard-coded passwords in SAS files by replacing them with the AUTHDOMAIN option to access databases through the SAS metadata environment.
- Develop SAS macros and modify pre-written macros to accommodate business-logic changes as needed.
- Use the SAS hash object technique for data set integration and DATA step lookups to improve performance.
- Create customized reports in SAS using the ODS facility.
- Perform UAT and obtain sign-off from the product owner and business users.
- Help create a data lake by extracting Big Data from various sources into Hadoop HDFS, including data from Excel, flat files, Oracle, Teradata, and HBase.
- Migrate data from RDBMS (Oracle, Teradata) to a distributed solution on Hadoop clusters using Cloudera (CDH5).
- Design the Hadoop data flow from ingestion and compression into HDFS using Sqoop through conversion, transformation, and partitioning with Impala and Hive (see the HiveQL sketch after this list).
- Design Oozie workflows for data transfer between RDBMS (Oracle) and Hive.
- Store data from HDFS in the respective Hive tables for further analysis to identify trends in the data.
- Set up and test new jobs in the AutoSys scheduler for SAS and UNIX shell scripts.
- Create various sample datasets in the testing environment and conduct UAT/reviews for sign-off.
- Perform production deployments for ad hoc code changes per business requirements.
- Work on ad hoc queries for various application reports generated on a daily, weekly, or monthly basis.
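A minimal sketch of the external-table load pattern referenced above, assuming a pipe-delimited feed; the directory object, file, and table names (stage_dir, cards.dat, cards_ext, card_accounts) are hypothetical placeholders, not the actual application objects.

```sql
-- Hypothetical external table over a pipe-delimited flat file.
CREATE TABLE cards_ext (
  account_id NUMBER,
  card_type  VARCHAR2(20),
  open_date  DATE
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY stage_dir   -- DB directory object pointing at the landing path
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL
    (account_id, card_type, open_date CHAR(10) DATE_FORMAT DATE MASK "YYYY-MM-DD")
  )
  LOCATION ('cards.dat')
)
REJECT LIMIT UNLIMITED;

-- The file can then be queried like a table and loaded into the target.
INSERT INTO card_accounts (account_id, card_type, open_date)
SELECT account_id, card_type, open_date FROM cards_ext;
```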
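A minimal sketch of the bulk-load pattern referenced above (BULK COLLECT with FORALL ... SAVE EXCEPTIONS); the source and target tables (src_txns, tgt_txns) are hypothetical, and the LIMIT value is illustrative.

```sql
DECLARE
  CURSOR c_src IS SELECT * FROM src_txns;
  TYPE t_txns IS TABLE OF src_txns%ROWTYPE;
  l_txns      t_txns;
  bulk_errors EXCEPTION;
  PRAGMA EXCEPTION_INIT(bulk_errors, -24381);  -- ORA-24381: array DML errors
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_txns LIMIT 10000;  -- bound memory per batch
    EXIT WHEN l_txns.COUNT = 0;
    BEGIN
      FORALL i IN 1 .. l_txns.COUNT SAVE EXCEPTIONS
        INSERT INTO tgt_txns VALUES l_txns(i);
    EXCEPTION
      WHEN bulk_errors THEN
        -- Report each failed row and carry on with the next batch.
        FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
          DBMS_OUTPUT.PUT_LINE(
            'Row ' || SQL%BULK_EXCEPTIONS(j).ERROR_INDEX || ' failed: ' ||
            SQLERRM(-SQL%BULK_EXCEPTIONS(j).ERROR_CODE));
        END LOOP;
    END;
  END LOOP;
  CLOSE c_src;
  COMMIT;
END;
/
```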
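A minimal HiveQL sketch of the partition-and-store step in the ingestion flow above, assuming data has already landed in a staging table via Sqoop; the database and table names (staging.txns_raw, warehouse.txns) and the Parquet/Snappy storage choice are hypothetical.

```sql
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

-- Hypothetical curated table, partitioned by date and stored compressed.
CREATE TABLE IF NOT EXISTS warehouse.txns (
  txn_id BIGINT,
  amount DECIMAL(12,2)
)
PARTITIONED BY (txn_date STRING)
STORED AS PARQUET
TBLPROPERTIES ('parquet.compression'='SNAPPY');

-- Repartition the Sqoop-landed staging data into the curated table.
INSERT OVERWRITE TABLE warehouse.txns PARTITION (txn_date)
SELECT txn_id, amount, txn_date
FROM staging.txns_raw;
```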
Tools: Oracle PL/SQL, Java, SAS, UNIX, Hadoop, Hive, Sqoop, Pig, Spark, YARN, MapReduce, HBase, Hue, HDFS, Oozie, shell scripting, Tableau, Teradata, AutoSys.
Application Developer PL/SQL & SAS
Confidential
Responsibilities:
- Requirement analysis and design of models based on discussions with application owners and business users.
- Developed SAS code to generate user reports (standard data files) per business requirements, converted Excel reports to SAS datasets, and tested and debugged them.
- Used the LIBNAME statement and SQL pass-through facility to read policy data from other sources.
- Produced quality customized reports using PROC REPORT, PROC TABULATE, and SAS/ODS.
- Provided descriptive analysis using PROC MEANS and PROC FREQ.
- Created database objects such as tables, views, stored procedures, functions, packages, DB triggers, and indexes.
- Used bulk techniques (BULK COLLECT, FORALL, SAVE EXCEPTIONS, SQL%BULK_EXCEPTIONS) to improve performance when loading and updating large data sets with reduced context switching.
- Performed SQL/PLSQL performance tuning with EXPLAIN PLAN and DBMS_PROFILER (see the tuning sketch after this list).
- Created ksh scripts to automate housekeeping processes, reports with scheduled jobs, and manual file transfers from one server to another.
- Prepared maintenance requirement and technical specification documents per client requirements.
- Designed job schedule information for batch jobs with the AutoSys scheduler.
- Coordinated with the testing team to analyze and resolve raised defects.
- Documented projects with artifacts such as the functional requirement specification (FRS) and ER diagrams.
- Worked with application owners to automate and generate ad hoc reports for their business-improvement studies and batch processes with UNIX shell scripting.
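A minimal sketch of the tuning workflow referenced above; the query, table name (policies), and profiler run comment are hypothetical.

```sql
-- Explain a candidate statement and render its plan from the plan table.
EXPLAIN PLAN FOR
  SELECT p.policy_id, p.premium
  FROM   policies p
  WHERE  p.status = 'ACTIVE';

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- Profile a PL/SQL unit with DBMS_PROFILER to locate hot lines.
DECLARE
  l_run BINARY_INTEGER;
BEGIN
  l_run := DBMS_PROFILER.START_PROFILER(run_comment => 'policy batch tuning');
  -- ... invoke the procedure under test here ...
  DBMS_PROFILER.STOP_PROFILER;
END;
/
```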
Application Developer PL/SQL & SAS
Confidential
Responsibilities:
- Developed the Ezpay application using Oracle 11g, PL/SQL Developer, and TOAD.
- Worked on account summary batch services; created stored procedures, packages, tables, indexes, arrays, records, and dynamic SQL (see the dynamic SQL sketch after this list).
- Responsible for ad hoc changes at the database level.
- Developed UIs with JSP, JavaScript, HTML, and CSS.
- Wrote JSPs and servlets to add functionality to the web application as change requirements arose.
- Created UNIX ksh scripts for batch processes and scheduled them using Control-M.
- Gathered requirements from the business and prepared use-case diagrams and business requirement documents (BRD).
- Worked on project plan preparation, functional specification documents, PL/SQL code reviews, and test case creation.
- Adhered to timelines within the given schedule and delivered without slippage.
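A minimal sketch of the native dynamic SQL pattern referenced above, using EXECUTE IMMEDIATE with a bind variable; the table and column names (acct_summary, balance, cycle_month) are hypothetical.

```sql
DECLARE
  l_sql   VARCHAR2(200);
  l_total NUMBER;
BEGIN
  -- Build the statement once; bind values rather than concatenating them.
  l_sql := 'SELECT SUM(balance) FROM acct_summary WHERE cycle_month = :m';
  EXECUTE IMMEDIATE l_sql INTO l_total USING '2024-01';
  DBMS_OUTPUT.PUT_LINE('Cycle total: ' || l_total);
END;
/
```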