SQL Developer Resume
SUMMARY
- 12+ years of IT industry experience in data warehousing for Telecom and Banking applications.
- Extensive experience with Big Data, Teradata, and mainframe platforms: analyzing clients' business needs, developing effective and efficient solutions, and ensuring client deliverables within committed timelines.
- Skilled in Hadoop implementation phases, with a deep understanding of Hadoop ecosystem implementation methodologies.
- Experience in business analysis and data analysis, user requirement gathering and analysis, data cleansing, data transformations, data relationships, source systems analysis, and reporting analysis.
- Experienced in installing, configuring, supporting, and managing Cloudera CDH clusters on a data lake.
- Experience in implementing TLS/SSL and MIT/AD Kerberos authentication mechanisms.
- Extensive experience with development, testing, debugging, implementation, documentation, and production support.
- Experienced in deploying proof-of-concept (POC) environments.
- Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad), Teradata parallel support, and Unix shell scripting.
- Proficient in coding optimized Teradata batch-processing scripts for data transformation, aggregation, and load using BTEQ.
- Strong experience in creating database objects such as tables, views, and indexes in Teradata.
- Created tables and views based on layouts sent by clients.
- Sound knowledge of the Tivoli scheduler: creating schedules and job definitions, job scheduling, and monitoring.
- Experienced as a developer on IBM mainframes.
- Designed and modeled Data Marts as per business requirements.
- Worked extensively on the development of large projects, with complete end-to-end participation in all areas of the software development life cycle, and maintained documentation.
- Quick adaptability to new technologies and zeal to improve technical skills.
- Good analytical, programming, problem solving and troubleshooting skills.
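As context for the BTEQ batch-processing work noted above, such scripts typically follow the shape below; the logon credentials, database, and table names are illustrative placeholders, not actual client objects.

```sql
.LOGON tdprod/etl_user,password;      /* placeholder TDPID and credentials */

DATABASE sandbox_db;                  /* hypothetical working database */

/* Aggregate daily call records into a reporting table */
INSERT INTO daily_call_summary (call_dt, region_cd, call_cnt, total_minutes)
SELECT call_dt,
       region_cd,
       COUNT(*),
       SUM(duration_sec) / 60.0
FROM   call_detail_stg
GROUP BY call_dt, region_cd;

/* Abort with a non-zero return code on failure so the scheduler can react */
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```

A script like this is usually invoked from a Unix shell wrapper (for example, `bteq < load_summary.btq > load_summary.log`) so that a scheduler such as Tivoli or Autosys can act on the return code.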
TECHNICAL SKILLS
Operating systems: UNIX, OS/390, Windows
Languages: COBOL, JCL, Teradata SQL, Sqoop, Java
Databases: Teradata, DB2, IMS, HDFS, Hadoop, Hive, Scala, Oracle, Cassandra
OLTP: CICS
Tools & Utilities: VSAM, SPUFI, File-AID, Panvalet, ChangeMan, Quality Center, ServiceNow, Teradata SQL Assistant, Informatica, Tivoli, ESP menu, Autosys
Environment: TSO/ISPF
Other Tools: ServiceNow, JIRA, deployment tool, Splunk, DataStax, Netcool, Tableau, Hue, DevOps, Jenkins
PROFESSIONAL EXPERIENCE
Confidential
SQL Developer
Responsibilities:
- Involved in cluster planning, capacity planning, and Hadoop parcel/package repository creation.
- Deployed a Cloudera Hadoop proof of concept (POC) from scratch.
- Implemented security modules in the POC environment, including TLS/SSL for Cloudera Manager and CDH components, Kerberos integration with AD/MIT, RBAC using Sentry, HDFS ACLs, and log/query redaction.
- Responsible for JDK, CDH, and Cloudera Manager upgrades, commissioning and decommissioning nodes, and enabling NameNode/ResourceManager HA.
- Managed services, roles, hosts, and rack awareness using Cloudera Manager; troubleshot application failures.
- YARN scheduler queue management and log analysis.
- Monitored applications, services, and servers in production using Autosys and Cloudera Manager.
- Provided fixes for failed jobs.
- Performed production deployments for release activities using the UBS deployment tool and JIRA.
- Provided emergency fixes, weekend deployments, and support.
- Performed infrastructure weekend, monthly, and quarterly patching.
- Identified automation opportunities and deployed automation in production.
- Debugged and performance-tuned at the application and database level.
- Communicated directly with clients, gathered requirements, and handled production implementation.
- Handled the team and mentored new joiners.
- Client meetings and coordination, billing activities, team work assignments, etc.
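For the AD/MIT Kerberos integration described above, every cluster node needs a consistent `/etc/krb5.conf`; a minimal sketch follows, where the realm and KDC hostnames are illustrative placeholders rather than actual client infrastructure.

```ini
[libdefaults]
  default_realm = EXAMPLE.COM          ; placeholder realm
  dns_lookup_kdc = false
  ticket_lifetime = 24h
  renew_lifetime = 7d
  forwardable = true

[realms]
  EXAMPLE.COM = {
    kdc = kdc01.example.com            ; MIT KDC or AD domain controller
    admin_server = kdc01.example.com
  }

[domain_realm]
  .example.com = EXAMPLE.COM
  example.com = EXAMPLE.COM
```

With this in place, Cloudera Manager's Kerberos wizard can generate service principals and keytabs against the configured KDC.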
Confidential
SQL Developer
Responsibilities:
- Performed data analysis and gathered columns metadata of source systems for understanding requirement feasibility analysis.
- Created a logical data flow model from the source system study according to business requirements, using MS Visio.
- Transformed Logical Data Model to Physical Data Model ensuring the Primary Key and Foreign key relationships.
- Consistency of definitions of Data Attributes and Primary Index considerations.
- Prepared high-level design (HLD) and detailed design (DD) documents.
- Involved in cluster planning, capacity planning, and Hadoop parcel/package repository creation.
- Deployed a Cloudera Hadoop proof of concept (POC) from scratch.
- Implemented security modules in the POC environment, including TLS/SSL for Cloudera Manager and CDH components, Kerberos integration with AD/MIT, RBAC using Sentry, HDFS ACLs, and log/query redaction.
- Responsible for JDK, CDH, and Cloudera Manager upgrades, commissioning and decommissioning nodes, and enabling NameNode/ResourceManager HA.
- Managed services, roles, hosts, and rack awareness using Cloudera Manager; troubleshot application failures.
- YARN scheduler queue management and log analysis.
- Monitored applications, services, and servers in production using Autosys and Cloudera Manager.
- Provided fixes for failed jobs.
- Coding and unit testing.
- SIT, UAT, and live implementation.
- TWS coding and implementation.
- Fixed bugs in development and live environments.
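Transforming the logical model into a Teradata physical model, as described above, comes down to DDL of roughly this shape; the tables and columns below are hypothetical examples, not the actual client schema.

```sql
-- Parent dimension: unique primary index doubles as the primary key
CREATE TABLE region_dim (
    region_cd    CHAR(4)      NOT NULL,
    region_name  VARCHAR(60)  NOT NULL,
    CONSTRAINT pk_region PRIMARY KEY (region_cd)
) UNIQUE PRIMARY INDEX (region_cd);

-- Child fact table: foreign key back to the dimension; a non-unique
-- primary index on region_cd is chosen for AMP distribution and
-- join locality with the dimension
CREATE TABLE call_fact (
    call_id      DECIMAL(18,0) NOT NULL,
    call_dt      DATE          NOT NULL,
    region_cd    CHAR(4)       NOT NULL,
    duration_sec INTEGER,
    CONSTRAINT fk_region FOREIGN KEY (region_cd)
        REFERENCES region_dim (region_cd)
) PRIMARY INDEX (region_cd);
```

The primary-index choice is the key physical-design decision in Teradata: it governs row distribution across AMPs, so a column with reasonable cardinality and frequent join use is preferred.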
Confidential
SQL Developer
Responsibilities:
- Performed data analysis and gathered columns metadata of source systems for understanding requirement feasibility analysis.
- Worked on Teradata stored procedures and functions to conform the data and load it into the tables.
- Worked on optimizing and tuning Teradata views and SQL queries to improve batch performance and data response time for users.
- Worked closely with analysts to come up with detailed solution approach design documents.
- Provided initial capacity and growth forecast in terms of Space, CPU for the applications by gathering the details of volumes expected from Business.
- Prepared low level technical design document and participated in build/review of the BTEQ Scripts, Multiloads and Fast Load scripts.
- Verified that implementation was done as expected, i.e., checked that code members were applied in the correct locations, schedules were built as expected, and dependencies were set as requested.
- Performed impact assessments in terms of schedule changes, dependency impacts, and code changes for various change requests on existing data warehouse applications running in a production environment.
- Provided quick production fixes and proactively involved in fixing production support issues.
- Involved in the complete software development life cycle (SDLC), including requirements gathering, analysis, design, development, testing, implementation, and deployment.
- Developed technical design documents (HLD and LLD) based on the functional requirements.
- Coordinate with Configuration management team in code deployments.
- Involved in cluster planning, capacity planning, and Hadoop parcel/package repository creation.
- Deployed a Cloudera Hadoop proof of concept (POC) from scratch.
- Implemented security modules in the POC environment, including TLS/SSL for Cloudera Manager and CDH components, Kerberos integration with AD/MIT, RBAC using Sentry, HDFS ACLs, and log/query redaction.
- Responsible for JDK, CDH, and Cloudera Manager upgrades, commissioning and decommissioning nodes, and enabling NameNode/ResourceManager HA.
- Managed services, roles, hosts, and rack awareness using Cloudera Manager; troubleshot application failures.
- YARN scheduler queue management and log analysis.
- Monitored applications, services, and servers in production using Autosys and Cloudera Manager.
- Provided fixes for failed jobs.
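The FastLoad scripts referenced in this role follow a standard skeleton like the one below; the session count, file path, and table names are placeholders for illustration only.

```sql
SESSIONS 8;
LOGON tdprod/etl_user,password;        /* placeholder TDPID and credentials */

/* FastLoad requires an empty target; error tables capture rejected rows */
DROP TABLE sandbox_db.call_detail_stg_err1;
DROP TABLE sandbox_db.call_detail_stg_err2;

SET RECORD VARTEXT "|";                /* pipe-delimited input file */

DEFINE call_id      (VARCHAR(20)),
       call_dt      (VARCHAR(10)),
       region_cd    (VARCHAR(4)),
       duration_sec (VARCHAR(10))
FILE = /data/in/call_detail.dat;       /* hypothetical input path */

BEGIN LOADING sandbox_db.call_detail_stg
      ERRORFILES sandbox_db.call_detail_stg_err1,
                 sandbox_db.call_detail_stg_err2;

INSERT INTO sandbox_db.call_detail_stg
VALUES (:call_id, :call_dt, :region_cd, :duration_sec);

END LOADING;
LOGOFF;
```

After the load, the two error tables are checked for reject counts before downstream BTEQ transformation steps are released by the scheduler.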