Who we are:
We’re a team of dedicated engineers committed to unlocking the power of data by extracting and transforming unstructured web data into clean, ready-to-use datasets for our clients. An integral part of our offering is large-scale web crawling and extraction using cloud computing and distributed technologies. We’re on a quest for innovative ways to solve the business problems of data acquisition and normalization on the web. Our vision is to make PromptCloud a one-stop brand for data, and our growth is geared towards that.
We have a collaborative and inclusive work environment where individuals can grow their professional careers by working with some of the most advanced technologies and a talented team that includes the company founders.
We’re looking for energetic, innovative thinkers who share our vision – and our commitment to technology leadership and customer success. If you’re looking for a challenging career with a successful market leader that still has the excitement and hunger of a startup, and you have a passion for technology and the willingness to put in the effort to grow, you should join us.
Where we are at PromptCloud:
We are a bootstrapped company in the mid-growth phase, planning to expand quickly – not so much in headcount as in our solutions and the global coverage of our clients. Multiple local and global brands are our clients; more details here: (PromptCloud).
As a Sr. Software Engineer with PromptCloud, you will be responsible for building large-scale web crawling setups that run on our proprietary infrastructure, which churns out hundreds of millions of data records every month. To handle data at this scale, we use cutting-edge open-source technologies for large-scale data processing, such as Hadoop, Spark, Redis, MySQL, RabbitMQ, and NoSQL databases, along with cloud services like AWS and GCE, depending on specific needs. We are not tied to one technology; instead, we use whatever is best suited for the purpose. All of our systems tend to be loosely coupled, communicating via synchronous and asynchronous messaging, leading to a classic large-scale distributed data processing architecture.
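To give a flavor of what "loosely coupled, communicating asynchronously" means in practice, here is a minimal, purely illustrative sketch (not PromptCloud's actual code): a producer hands raw records to a queue and an independent consumer normalizes them. In production, a broker such as RabbitMQ would sit between the two processes, but the decoupling idea is the same.

```python
import queue
import threading

SENTINEL = None  # signals the consumer that no more records are coming

def producer(q, records):
    # Asynchronous hand-off: the producer never waits on the consumer.
    for r in records:
        q.put(r)
    q.put(SENTINEL)

def consumer(q, out):
    while True:
        r = q.get()
        if r is SENTINEL:
            break
        # Stand-in for real normalization logic (cleaning, deduplication, etc.).
        out.append(r.strip().lower())

q = queue.Queue()
results = []
t = threading.Thread(target=consumer, args=(q, results))
t.start()
producer(q, ["  Widget A ", "WIDGET B"])
t.join()
print(results)  # ['widget a', 'widget b']
```

Because the producer and consumer only share the queue's contract, either side can be scaled, replaced, or restarted independently – the core property of a loosely coupled pipeline.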
The work is technically deep and demanding; in return, you will gain hands-on experience with the latest data crawling and processing technologies and work on some truly exciting projects with exceptional colleagues.
Responsibilities and expectations: