Empowering businesses with valuable data to create more inclusive AI

About Our Data

Our datasets focus exclusively on marginalised communities, ensuring that your AI models are trained on data that reflects the groups you aim to serve.
Whether you need insights on a specific demographic (e.g., West African millennial women) or broader representation across diverse communities, we provide datasets that help you avoid and correct biases in your models' outputs.

Image & Video Datasets

Get high-quality image and video datasets designed for a wide range of computer vision applications. Our offerings include facial datasets (selfies, ID cards, expressions, occluded faces), full body shots, food, buildings and everyday objects.

These datasets are ideal for training and fine-tuning AI models in facial recognition, object detection, identity verification and other advanced computer vision tasks, ensuring accuracy and inclusion across various use cases.
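As an illustration of how such a dataset plugs into a fine-tuning pipeline, most computer vision toolchains expect image paths paired with labels. The sketch below assumes a hypothetical `<label>/<filename>` folder layout and builds that pairing; the layout and the function name are assumptions for illustration, not a description of our delivery format.

```python
from pathlib import PurePosixPath

def build_manifest(image_paths):
    """Pair each image path with a label taken from its parent folder name.

    Assumes a hypothetical '<label>/<filename>' layout, e.g.
    'occluded_faces/img_102.jpg' -> label 'occluded_faces'.
    """
    manifest = []
    for p in image_paths:
        path = PurePosixPath(p)
        manifest.append((str(path), path.parent.name))
    return manifest

paths = [
    "selfies/img_001.jpg",
    "id_cards/img_014.jpg",
    "occluded_faces/img_102.jpg",
]
print(build_manifest(paths))
```

A manifest like this can then be handed to any standard training loader that consumes (path, label) pairs.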

Audio Datasets

Explore our collection of high-quality, multilingual audio datasets. We offer a diverse range of accurately transcribed speech data, including general conversations, scripted monologues and command-based audio.

These recordings are ideal for training and refining Automatic Speech Recognition, Conversational AI, Text-to-Speech and Voice Assistant models.

Each dataset has detailed metadata and transcriptions, ensuring seamless integration into your AI models.
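For example, a metadata file pairing audio clips with their transcriptions can be loaded in a few lines. The column names below (`file`, `language`, `transcript`) and the sample rows are illustrative assumptions, not our actual schema.

```python
import csv
import io

# Illustrative metadata sample; column names and values are assumptions.
SAMPLE = """file,language,transcript
clip_0001.wav,yo,Bawo ni
clip_0002.wav,en,Turn on the lights
"""

def load_transcripts(csv_text):
    """Map each audio file name to its (language, transcript) pair."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {row["file"]: (row["language"], row["transcript"]) for row in reader}

transcripts = load_transcripts(SAMPLE)
print(transcripts["clip_0002.wav"])  # → ('en', 'Turn on the lights')
```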

Text Datasets

Explore our diverse conversational chat datasets, designed to enhance and fine-tune conversational AI models. Featuring rich, multi-turn dialogues across varied contexts and topics, these datasets are ideal for training chatbots, virtual assistants, and other NLP applications.

Whether you're focused on natural language understanding, sentiment analysis, or dialogue management, our datasets provide the comprehensive foundation you need to develop and improve response generation.
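To sketch how multi-turn dialogues typically feed a fine-tuning run, the snippet below converts one dialogue record into the role/content message format many chat-model toolkits expect. The input schema (`speaker`/`text` keys) and the speaker names are assumptions for illustration, not our delivery format.

```python
def to_chat_messages(dialogue, user_speaker="customer"):
    """Convert a list of {'speaker', 'text'} turns into role/content messages.

    Turns from `user_speaker` become 'user'; all other turns become 'assistant'.
    """
    messages = []
    for turn in dialogue:
        role = "user" if turn["speaker"] == user_speaker else "assistant"
        messages.append({"role": role, "content": turn["text"]})
    return messages

dialogue = [
    {"speaker": "customer", "text": "Do you deliver to Accra?"},
    {"speaker": "agent", "text": "Yes, delivery takes two working days."},
]
print(to_chat_messages(dialogue))
```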

Our data addresses a critical need in the AI landscape.

As the industry faces a growing shortage of diverse and high-quality training data, Diverse Pics bridges the gap between underrepresented communities and the global tech ecosystem, enabling the development of more inclusive, representative and reliable AI models.

  • We provide the data necessary for AI systems to better recognise and understand diverse faces, voices and cultures, fostering a more inclusive technological future.

  • Our datasets help businesses address biases in training data, creating fairer algorithms and reducing systemic bias in AI systems.

  • By using data that represents a broader spectrum of demographics, you can significantly improve the accuracy and performance of your AI models.

  • Researchers can leverage our data to make meaningful advancements in fields like healthcare, education and social development, particularly for diverse populations.

How We Collect

We employ a dual approach: engaging directly with communities through partnerships with local leaders, and providing an app through which individuals can submit their own data. Together, these channels ensure a comprehensive and diverse dataset.

If you're a researcher or AI engineer committed to building fair, inclusive models, get in touch now and let's discuss your project.

  • We obtain explicit consent from each participant, clearly outlining the purpose and implications of data collection. We ensure that participants understand their rights and have the option to withdraw their consent at any time.

  • Our data collection process aligns with UNESCO's Recommendation on the Ethics of Artificial Intelligence. This global standard emphasises the importance of developing robust data governance strategies, including the continual evaluation of training data quality, adequate data security and protection, and privacy safeguards. We adhere to these guidelines to ensure transparency, fairness and the protection of individuals' rights throughout the lifecycle of AI systems.

  • We employ various anonymisation techniques to remove personally identifiable information from the data, protecting participant privacy while maintaining data integrity.
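As a simple illustration of one such technique, the sketch below redacts email addresses and phone-like number sequences from a transcript using regular expressions. The patterns here are assumptions chosen for clarity; production anonymisation combines more robust methods (e.g. face blurring for images, named-entity redaction for text), and this is not our full pipeline.

```python
import re

# Deliberately simple patterns for illustration only.
EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")
PHONE = re.compile(r"\+?\d[\d\s-]{7,}\d")

def redact(text):
    """Replace email addresses and phone-like numbers with placeholder tags."""
    text = EMAIL.sub("[EMAIL]", text)
    return PHONE.sub("[PHONE]", text)

print(redact("Reach me at ama@example.com or +233 24 123 4567."))
# → Reach me at [EMAIL] or [PHONE].
```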