Database Expert Resume
Quincy
PROFESSIONAL SUMMARY:
- 8 years of IT experience as a Database Expert, with expertise in Oracle PL/SQL and ETL tools such as Informatica and Talend.
- Expert in creating complex SQL queries, analytic and aggregate functions, subqueries, hierarchical queries, PL/SQL packages, functions, stored procedures, exception handling, transaction control, and XML generation. Created database objects using Oracle tools such as SQL*Plus, SQL Developer, and Toad.
- Excellent working knowledge of Oracle database architecture (column orientation, projections, pivoting, segmentations, partitions, high availability and recovery, security)
- Use of materialized views, index-organized tables, Virtual Private Database, indexes, partitions, and various hints for tuning queries that process billions of rows.
- SQL*Loader for direct-path and parallel loads of data from raw files into database tables.
- Matching and Merging the data based on various matching rules.
- Use of Analytical and Aggregate functions for performing various operations in SQL.
- Proficient in advanced Oracle 11g PL/SQL features such as records and collections, bulk binds, ref cursors, nested tables, and dynamic SQL (a brief bulk-bind sketch follows this summary).
- Use of efficient joins, indexes, bulk operations, and materialized views for query optimization.
- Use of EXPLAIN PLAN to analyze execution plans and apply Oracle hints for performance tuning.
- Virtual Private Database and fine-grained auditing to implement an additional layer of data security.
- Creating meta files in Unix to validate raw file layouts and loading the files into tables using Talend jobs. Use of cron jobs to schedule the daily and monthly jobs, Unix shell scripts to invoke the Talend jobs, and scp for file transfers between servers.
- Use of tMap, tFileList, tFileInputDelimited, etc. in Talend to validate and load external files (CSV, XML, raw) into the Oracle database.
- Calling shell scripts from the tSystem component in Talend to validate and clean data.
- Created many Talend jobs to call Oracle procedures and functions.
- Use of pipelined functions and the result cache in PL/SQL. Use of rollback segments in Oracle stored procedures.
- Use of TAC to run the jobs created in Talend Studio, running jobs on the server and reviewing log files to check job status.
- Use of Talend components for data cleaning, data validation, data mapping, data loading into Oracle objects.
- Cron job for scheduling the jobs.
- Oracle Enterprise Manager for monitoring the jobs.
- Use of WinSCP and FileZilla for file transfers from one server to another.
- Transferring files to the mailbox through FTP and SFTP.
- Worked on PowerCenter tools such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Transaction Control, Update Strategy, Union, Lookup, Joiner, and Stored Procedure transformations.
- Implemented SCD Type 1 and SCD Type 2 mappings to capture new changes and maintain historical data.
- Developed Unix shell scripts to schedule Informatica sessions.
- Implementation experience in relational and dimensional modeling, star/snowflake schemas.
- Working knowledge of data replication, data integration, high availability, Master Data Management, and data governance.
- Proven experience architecting, designing, and implementing low-latency, near-real-time systems and working with 50+ TB data sets.
- Use of JIL files and applyIM to install changes in development and production environments.
- Use of Oracle GoldenGate for migrating data from MDM to CDE.
- Proven experience with ETL technologies and large data movement techniques.
- Change Data Capture (CDC) and data integration expertise
- Adopted Agile methodologies, participating in daily scrum meetings and working on sprints created by the scrum master in JIRA.
- Strong experience in data modeling. Use of Toad Data Modeler 3.x to design logical and physical data models (LDM and PDM).
- Worked on Oracle Database Upgrade and SQL Server decommissions.
- Extensive use of macros in Excel.
- Worked extensively on Stash, Team Foundation Server, and ClearCase for version control and merging code from development to master.
- Documentation of projects covering Business Requirement Specs (BRS), Functional Requirement Specs (FRS), use case specs, DFDs, test plans, test scripts, and test cases.
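The bulk-bind pattern noted above can be illustrated with a minimal PL/SQL sketch; the staging and target tables (stg_employees, employees) and the batch size are hypothetical and shown only for illustration.

DECLARE
  TYPE t_emp_tab IS TABLE OF stg_employees%ROWTYPE;
  l_rows t_emp_tab;
  CURSOR c_src IS SELECT * FROM stg_employees;
BEGIN
  OPEN c_src;
  LOOP
    -- Fetch in batches to keep PGA memory usage bounded.
    FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000;
    EXIT WHEN l_rows.COUNT = 0;
    BEGIN
      -- Bulk insert; SAVE EXCEPTIONS records per-row errors instead of aborting.
      FORALL i IN 1 .. l_rows.COUNT SAVE EXCEPTIONS
        INSERT INTO employees VALUES l_rows(i);
    EXCEPTION
      WHEN OTHERS THEN
        IF SQLCODE = -24381 THEN  -- ORA-24381: errors in array DML
          FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
            DBMS_OUTPUT.PUT_LINE('Failed row ' ||
              SQL%BULK_EXCEPTIONS(j).ERROR_INDEX || ': ' ||
              SQLERRM(-SQL%BULK_EXCEPTIONS(j).ERROR_CODE));
          END LOOP;
        ELSE
          RAISE;
        END IF;
    END;
  END LOOP;
  CLOSE c_src;
  COMMIT;
END;
/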
TECHNICAL SKILLS:
Languages: SQL, Oracle PL/SQL, Unix Shell Scripting
Oracle Tools: SQL Developer 4.1, TOAD 10.6, SQL*Plus, Oracle GoldenGate
ETL Tools: Talend Studio 5.3, Informatica 9.x, SQL*Loader, DataStage (Designer 8.5/11.5)
Databases: Oracle 9i/10g/11g, Oracle 11g Exadata, DB2
Data Modeling Tools: Toad Data Modeler 3.6, Erwin 7.x, MS Visio 2013, MS PowerPoint
Job Schedulers: Autosys, Control-M, Tivoli
Servers: Unix, Informatica
Operating Systems: UNIX, Linux, Windows 7/XP/NT/Vista
Version Control Tools: Stash, ClearCase, Harvest, SCCS, TFS, VSS
Agile Tools: Jira, RTC
Ticketing Tools: SNOW, HP Helpdesk, ISM, Element Tool, Remedy
Other Tools: TAC, FileZilla, WinSCP, Putty, Cognos Reporting, IBM Data Studio, IIS, MS Excel 2007/2010/2013, CodeGen, JRS, HPQC, EQUIP, PRISM
PROFESSIONAL EXPERIENCE:
Confidential, Quincy
Database Expert
Responsibilities:
- Identifying the technical approach and providing the solution through high-level technical documents and estimates.
- Design, data modeling, and development of foundational FDR components used for staging the upstream data replicated through QREP and loaded through DataStage, using a rule framework.
- Create packages and pipelined table functions to query and produce results for various inputs (a brief pipelined-function sketch follows this list of responsibilities).
- Create various objects and collections to store the results of various views.
- Use of dynamic SQL in various functions within packages.
- Tuning the queries based on the explain plan and applying various hints based on cardinality and indexes for faster execution.
- Gather statistics on the large tables using DBMS_STATS.
- Create roles and privileges and grant access to tables and views based on roles.
- Create multiple schemas for classifying database objects and processes and providing different grants and access across schemas.
- Lead the development efforts at onsite, unit testing and systems integration testing.
- Work as a release coordinator for Production installs. Create implementation plan for monthly release and meeting invite for install review.
- Creating analyze scripts for new tables. Create dry run and main profile for deploying code in test and production environment.
- Take pre-install and post install snapshot of all required DB objects in database. Check out the partition maintenance for new tables.
- Creating database objects such as tables, views, procedures, packages, sequences, cursors, and triggers using PL/SQL and SQL*Plus, with exception handling.
- Working knowledge of Oracle database architecture (column orientation, projections, pivoting, segmentations, partitions, high availability and recovery, security)
- Expert in creating complex SQL queries, subqueries, and correlated queries using various aggregate and analytic functions.
- Use of CASE statements, DECODE, and analytic functions such as RANK(), DENSE_RANK(), and ROW_NUMBER() for calculating aggregations at different levels.
- Creating log tables, job and job dependency tables to run the batch jobs and store logging information.
- Use of WinSCP for file transfers between the server and the local machine and vice versa.
- Practice Agile methodologies, attend daily scrum meetings, and assign tasks in RTC.
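As a brief illustration of the pipelined table functions mentioned in this role, the sketch below uses hypothetical type and table names (acct_bal_t, account_balances); it is a generic example rather than actual FDR code.

CREATE OR REPLACE TYPE acct_bal_t AS OBJECT (
  account_id NUMBER,
  balance    NUMBER
);
/
CREATE OR REPLACE TYPE acct_bal_tab IS TABLE OF acct_bal_t;
/
CREATE OR REPLACE FUNCTION get_balances (p_as_of DATE)
  RETURN acct_bal_tab PIPELINED
IS
BEGIN
  -- Rows are piped back to the caller as they are produced,
  -- instead of building the entire collection in memory first.
  FOR r IN (SELECT account_id, balance
              FROM account_balances
             WHERE as_of_date = p_as_of) LOOP
    PIPE ROW (acct_bal_t(r.account_id, r.balance));
  END LOOP;
  RETURN;
END;
/
-- Queried like a regular table:
-- SELECT * FROM TABLE(get_balances(DATE '2015-01-31'));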
Confidential
Sr. Oracle Database Developer
Responsibilities:
- Creating database objects such as tables, views, procedures, packages, sequences, cursors, and triggers using PL/SQL and SQL*Plus, and handling exceptions.
- Excellent working knowledge of Oracle database architecture (column orientation, projections, pivoting, segmentations, partitions, high availability and recovery, security)
- Expert in creating complex SQL queries, subqueries, and correlated queries using various aggregate and analytic functions.
- Use of CASE statements, DECODE, and analytic functions such as RANK(), DENSE_RANK(), and ROW_NUMBER() for calculating aggregations at different levels (a brief sketch follows this project's environment line).
- Creating materialized views, table functions, collections, index-organized tables, bitmap and B-tree indexes, and range and list partitions on the tables.
- Exporting and Importing schema statistics from one database to other database for optimization.
- Monitoring the query performance using Oracle Enterprise Manager.
- Tuning the queries based on the explain plan and applying various hints such as parallel, fast full scan, and append for faster execution.
- SQL*Loader for direct-path and parallel loads of data from raw and CSV files into Oracle database tables.
- Gathering statistics on the large, frequently updated tables using DBMS_STATS.
- Applying dynamic SQL in stored procedures using the DBMS_SQL package or EXECUTE IMMEDIATE statements.
- Matching and Merging the customer related information through different stored procedures based on various matching rules as provided by client.
- Performing bulk operations such as bulk update, bulk insert into tables and handle exceptions if any.
- Designing logical and physical data model with database keys and constraints using Erwin data modeler as mentioned in File Interface Agreement.
- Creating design document and data flow diagrams for developing new process in the system.
- Implementing Virtual private database with fine grained access control for security of data.
- Creating roles and privileges and granting access to tables and views based on roles.
- Creating multiple schemas for classifying database objects and processes and providing different grants and access across schemas.
- Creating log tables, job and job dependency tables to run the batch jobs and store logging information.
- Providing Tier3 Production support for the batch jobs running overnight.
- Use of Pipelined functions and Result cache in PL/SQL. Use of Rollback segment in stored procedures.
- Use Talend components for data cleaning, data validation, data mapping, data loading into Oracle objects.
- Calling shell scripts from the tSystem component in Talend to validate and clean data.
- Creating meta files in Unix to validate raw file layouts and loading the files into tables using Talend jobs. Use of cron jobs to schedule the daily and monthly jobs. Use of Unix shell scripts to invoke the Talend jobs. File transfer from one server to another.
- Use of tMap, tFileList, tFileInputDelimited, tJava, etc. in Talend to validate and load external files (CSV, XML, raw) into the Oracle database.
- Created many Talend jobs to call Oracle procedures and functions.
- Use of TAC to run the jobs created in Talend Studio; also running jobs on the TAC server and viewing log files to check job status.
- Use of WinSCP and FileZilla for file transfers from one server to another. Transferring files to the client mailbox through FTP and SFTP.
- Use cron scheduler to schedule daily, weekly and monthly jobs.
- Creating Excel and PDF reports using the Cognos reporting tool.
- Practice Agile methodologies, attend daily scrum meetings, and assign tasks in Jira.
Environment: Oracle Database 10g/11g, Talend 5.3, SQL Developer 4.x, Toad 11.x, Toad Data Modeler 3.6, Autosys, FileZilla, Putty, Jira, Unix server, MS Visio, Erwin, WinSCP, Oracle Enterprise Manager 12c
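A minimal example of the analytic functions used in this role; the customer_accounts table and its columns are hypothetical.

-- Rank customers by balance within each region and compute a regional total.
SELECT region,
       customer_id,
       balance,
       RANK()       OVER (PARTITION BY region ORDER BY balance DESC) AS rnk,
       DENSE_RANK() OVER (PARTITION BY region ORDER BY balance DESC) AS dense_rnk,
       ROW_NUMBER() OVER (PARTITION BY region ORDER BY balance DESC) AS rn,
       SUM(balance) OVER (PARTITION BY region)                       AS region_total
  FROM customer_accounts;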
Confidential, Merrimack, New Hampshire
Senior Oracle PL/SQL Developer
Responsibilities:
- Work on Data contribution by creating database objects like tables, views, procedures, cursors, triggers, types, sequences, packages, functions using Toad and SQL Developer.
- Carried out data adoption by tuning complex SQL queries using explain plans, indexes, collections, and bulk operations. Use of partitions and Oracle hints for optimal query performance.
- Created conceptual, logical, and physical models and reviewed them with data analysts and DBAs to ensure adherence to the standards.
- Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, and Stored Procedure transformations.
- Worked on Power Center Tools like designer, workflow manager, and workflow monitor and repository manager.
- Implemented SCD Type 1 and SCD Type 2 mappings to capture new changes and maintain historical data (a rough SQL sketch of the Type 2 pattern follows this project's environment line).
- Worked on matching, merging, and consolidation of data through Informatica MDM using the MDM Hub.
- Scheduling data integration jobs using Autosys and Control-M.
- Coordinate with UI developers to call APIs through their services.
- Work on utPLSQL for unit testing.
- Work on migration of data from CDE to ODS to Warehouse using Informatica workflows and mappings.
- Use of JIL files and applyIM to install changes in development and production environments.
- Oracle GoldenGate for migrating the data from MDM to CDE.
- Use of Stash for version control, merging changes from development to master.
- Adopted Agile methodologies, participating in daily scrum meetings and working on sprints created by the scrum master in JIRA.
- Use of Putty to log in to the Informatica and Unix servers.
- Use of a code generator to generate packages of similar kinds.
Environment: Oracle Database 10g/11g, Informatica 9.1.0, SQL Developer 4.x, Toad 10.6, Toad Data Modeler 3.6, Autosys, Stash, Jira, Unix server, MS Visio, Informatica server, Oracle GoldenGate.
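The SCD Type 2 mappings above were built in Informatica PowerCenter; the SQL below is only a rough conceptual sketch of the same expire-and-insert pattern, using hypothetical customer_dim, customer_stg, and customer_dim_seq objects.

-- Expire the current dimension row when a tracked attribute changes.
UPDATE customer_dim d
   SET d.effective_end_dt = SYSDATE,
       d.current_flag     = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM customer_stg s
                WHERE s.customer_id = d.customer_id
                  AND (s.address <> d.address OR s.status <> d.status));

-- Insert a new current version for new or changed customers.
INSERT INTO customer_dim
  (customer_key, customer_id, address, status,
   effective_start_dt, effective_end_dt, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.status,
       SYSDATE, DATE '9999-12-31', 'Y'
  FROM customer_stg s
 WHERE NOT EXISTS (SELECT 1
                     FROM customer_dim d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y'
                      AND s.address = d.address
                      AND s.status  = d.status);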
Confidential
Oracle PL/SQL Developer
Responsibilities:
- Work with business stakeholders, application developers, and production teams and across functional units to identify business needs and discuss solution options
- Created database objects such as tables, views, procedures, and packages using PL/SQL and SQL*Plus, and handled exceptions.
- Excellent working knowledge of Oracle database architecture (column orientation, projections, pivoting, segmentations, partitions, high availability and recovery, security)
- Implementation experience in relational and dimensional modeling, star/snowflake schema
- Extensively use Unix Shell Scripting for creating scripts to call Oracle stored procedures and functions.
- Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, and Stored Procedure transformations.
- Worked on PowerCenter tools such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Worked on extract and load type of mappings.
- Worked with command line program pmcmd to interact with the server to start and stop sessions and batches, to stop the Informatica server and recover the sessions.
- Developed Unix shell scripts to schedule Informatica sessions.
- Used the Tivoli Job Scheduler for scheduling and monitoring jobs and their successor jobs. Set up various jobs in the Production and UAT environments.
- Resolving abended jobs using the application log files.
- Performance tuning, troubleshooting, and bug fixing of PL/SQL scripts while working closely with the UI developers (a brief tuning sketch follows this project's environment line).
- Maintaining quality related documents like Project Plan, PSSP, DR-BCP, RAT Register etc.
- Extensively used Erwin 7.x for data modeling and Dimensional Data Modeling
- Production support for the backend database and reporting applications.
- HPQC for logging defects and executing test cases.
- Carried out extensive data validation with client's data specialist validating complex SQL queries.
- Discussions with business groups about the data quality results and future steps to develop business rules to ensure efficient data quality check and validation.
- Designed the documentation covering functional specifications, the user manual, and technical reviews.
Environment: Oracle Database 10g/11g, PL/SQL, SQL, Informatica 8.x, Erwin 7.x, Telnet, Unix shell scripting, Toad, Team Foundation Server, Tivoli, JRS, HPQC.
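As a brief illustration of the explain-plan and statistics work described in this role, the statements below use a hypothetical TRANSACTIONS table and APP_SCHEMA owner.

-- Generate and display the execution plan for a problem query.
EXPLAIN PLAN FOR
  SELECT /*+ PARALLEL(t 4) FULL(t) */ account_id, SUM(amount)
    FROM transactions t
   WHERE txn_date >= DATE '2014-01-01'
   GROUP BY account_id;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- Refresh optimizer statistics on a large, frequently updated table.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname          => 'APP_SCHEMA',
    tabname          => 'TRANSACTIONS',
    estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
    cascade          => TRUE);
END;
/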