Role & responsibilities:
- Design, implement, deploy, and manage OpenStack infrastructure, ensuring high availability and performance.
- Develop and maintain deployment scripts and tools using Python and relevant frameworks.
- Integrate OpenStack with AWS and Kubernetes for hybrid cloud solutions.
Job description:
- Develop and maintain web applications.
- Collaborate with teams to design, develop, and deploy new features.
- Debug and troubleshoot issues.
- Strong problem-solving skills and attention to detail.
Job description:
- Strong knowledge of front-end technologies (HTML5, CSS3, JavaScript, and jQuery).
- Basic understanding of responsive layout and design.
- Familiarity with Python, Node.js, and React.js would be an added advantage.
- Solid understanding of programming concepts.
- Willingness to learn new tools and technologies.
Responsibilities:
- Develop and maintain high-quality, scalable, and robust back-end components for AI-based applications.
- Collaborate on feature design and implementation.
- Implement and optimize data storage solutions, RESTful APIs, and server-side logic.
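A back-end REST endpoint of the kind described above can be sketched, at its smallest, as a WSGI application. This is a minimal illustration only; the `/items` route and its payload are invented for the example, not part of the posting.

```python
import json

def app(environ, start_response):
    """Minimal WSGI sketch of one RESTful read endpoint (hypothetical /items route)."""
    if environ.get("PATH_INFO") == "/items" and environ.get("REQUEST_METHOD") == "GET":
        body = json.dumps({"items": ["a", "b"]}).encode()
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    # Anything else: a JSON 404, keeping the error shape consistent with the API.
    start_response("404 Not Found", [("Content-Type", "application/json")])
    return [json.dumps({"error": "not found"}).encode()]
```

In practice a framework such as Flask or FastAPI would handle routing and serialization; the WSGI form is used here only because it is standard-library and shows the request/response contract directly.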
Job description:
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Description:
- 5+ years of experience in software development, with at least 2+ years of hands-on experience working with the Kore.ai platform.
- Proven expertise in developing chatbots and virtual assistants using Kore.ai tools.
- Proficiency in programming languages such as JavaScript or other scripting languages.
Job description:
- Must have 3+ years of relevant experience in RPA development.
- Must have 3+ years of relevant experience with Blue Prism.
- Experience with any programming language or related tooling, such as Python or ABBYY, is preferred.
Job description:
We are looking for a highly motivated full-stack data scientist to join our dynamic team. The ideal candidate is passionate about extracting insights from complex data, developing end-to-end data solutions, building automation and process-improvement tools, and contributing to the success of the organization.
Required Technical and Professional Expertise:
- Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems.
- Implement data quality and validation processes within Ab Initio.
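Ab Initio graphs are built in its own graphical environment, so they cannot be shown as text here. As a rough analogy only, the extract-transform-load pattern the role describes, including a simple data-quality check, looks like this in plain Python (the CSV source and table are invented for illustration):

```python
import csv
import io
import sqlite3

# Extract: a CSV source (invented data; row 2 has a missing amount).
raw = "id,amount\n1,10.5\n2,\n3,7.0\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform + validate: drop rows that fail the data-quality rule (missing amount).
clean = [(int(r["id"]), float(r["amount"])) for r in rows if r["amount"]]

# Load: write the validated rows into a target table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE amounts (id INTEGER, amount REAL)")
con.executemany("INSERT INTO amounts VALUES (?, ?)", clean)
con.commit()
```

In Ab Initio the same three stages would be separate graph components (input file, reformat/filter, output table) wired together on the canvas.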
Skills:
- Electrical circuits: principles of current, voltage, and resistance.
- Understanding of electronic circuits, digital signals, and RF (radio frequency) principles.
- Knowledge of Wi-Fi 6E, Bluetooth, 5G, NB-IoT, GPS, and UWB.
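The current/voltage/resistance fundamentals above reduce to Ohm's law, I = V / R. A one-line worked example (the component values are hypothetical):

```python
def current(voltage_v: float, resistance_ohm: float) -> float:
    """Ohm's law: current in amperes, I = V / R."""
    return voltage_v / resistance_ohm

# A 3.3 V rail across a 220-ohm resistor draws 3.3 / 220 = 0.015 A, i.e. 15 mA.
i_ma = current(3.3, 220) * 1000
```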
Key Responsibilities:
- Develop and maintain scalable Python-based solutions for data analysis, processing, and modeling.
- Leverage the Pandas and NumPy libraries to manipulate, transform, and analyze large datasets efficiently.
- Design and implement data pipelines, ensuring data integrity and optimal performance.
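A minimal sketch of the Pandas/NumPy work described above: a vectorized NumPy transform followed by a grouped Pandas aggregation. The DataFrame contents and column names are invented for illustration.

```python
import numpy as np
import pandas as pd

# Hypothetical sensor readings (illustrative data only).
df = pd.DataFrame({
    "sensor": ["a", "a", "b", "b"],
    "reading": [1.0, 3.0, 2.0, 6.0],
})

# Vectorized transform with NumPy: log(1 + x), applied column-wise without a loop.
df["log_reading"] = np.log1p(df["reading"])

# Grouped aggregation with Pandas: mean reading per sensor.
summary = df.groupby("sensor", as_index=False)["reading"].mean()
```

On large datasets this style matters: vectorized NumPy operations and Pandas group-bys run in compiled code rather than per-row Python loops.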
Job description:
- Should have 7+ years of experience in big data with Java or Python.
- Experience with Hadoop, Kubernetes, and Docker.