Toxicity Alternatives
Similar projects and alternatives to toxicity
- zotero: Zotero is a free, easy-to-use tool to help you collect, organize, annotate, cite, and share your research sources.
- WorkOS: The modern identity platform for B2B SaaS. The APIs are flexible and easy to use, supporting authentication, user identity, and complex enterprise features like SSO and SCIM provisioning.
- Hasura: Blazing-fast, instant realtime GraphQL APIs on your DB with fine-grained access control; also trigger webhooks on database events.
- Zulip: Zulip server and web application. Open-source team chat that helps teams stay productive and focused.
- PostHog: 🦔 PostHog provides open-source product analytics, session recording, feature flagging, and A/B testing that you can self-host.
- InfluxDB: Power real-time data analytics at scale. Get real-time insights from all types of time series data with InfluxDB. Ingest, query, and analyze billions of data points in real time with unbounded cardinality.
- trivy: Find vulnerabilities, misconfigurations, secrets, and SBOMs in containers, Kubernetes, code repositories, clouds, and more.
- Metabase: The simplest, fastest way to get business intelligence and analytics to everyone in your company.
- Lean and Mean Docker containers: Slim (toolkit): don't change anything in your container image and minify it by up to 30x (even more for compiled languages), making it more secure too. Free and open source.
- Fleet: Open-source platform for IT, security, and infrastructure teams. (Linux, macOS, Chrome, Windows, cloud, data center) (by fleetdm)
- seldon-core: An MLOps framework to package, deploy, monitor, and manage thousands of production machine learning models.
- hate-speech-and-offensive-language: Repository for the paper "Automated Hate Speech Detection and the Problem of Offensive Language", ICWSM 2017.
- SaaSHub: Software alternatives and reviews. SaaSHub helps you find the best software and product alternatives.
toxicity reviews and mentions
- Perhaps It Is a Bad Thing That the Leading AI Companies Cannot Control Their AIs
I'm a PM at a human data company (https://www.surgehq.ai) that helps the large language model companies ensure their models are safe (we're the “clever prompt engineers” who helped Redwood assess their model performance).
We actually just published a blog today that includes our perspective on building “AI red teams” and best practices for AI alignment/safety: https://www.surgehq.ai/blog/ai-red-teams-for-adversarial-tra...
- 30% of Google's Emotions Dataset Is Mislabeled
I'd love to chat. Want to reach out to the email in my profile? I'm the founder of a much higher-quality data startup (https://www.surgehq.ai), and previously built the human computation platforms at a couple of FAANGs.
We work with a lot of the top AI/NLP companies and research labs, and do both the "typical" data labeling work (sentiment analysis, text categorization, etc.) and a lot more advanced work (e.g., training coding assistants, evaluating the new wave of large language models, adversarial labeling) — so not just distinguishing cats from dogs, but making full use of the power of the human mind!
- Building a No-Code Toxicity Classifier – By Talking to GitHub Copilot
> Rather than operating under a strict definition of toxicity, we asked our team to identify comments that they personally found toxic.
[0]: https://github.com/surge-ai/toxicity
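The linked repo pairs raw comments with the annotators' personal toxic / not-toxic judgments, which is enough to train a simple baseline classifier. Here is a minimal sketch with scikit-learn; the comments and labels below are hypothetical stand-ins, not rows from the actual dataset:

```python
# Baseline toxicity classifier sketch: TF-IDF features + logistic regression.
# The training examples below are invented placeholders standing in for the
# human-labeled comments in the surge-ai/toxicity dataset.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

comments = [
    "you are a wonderful person",
    "what an idiot, nobody wants you here",
    "thanks for sharing, this was helpful",
    "shut up, you worthless troll",
    "great write-up, learned a lot",
    "go away, everyone hates you",
]
labels = ["non-toxic", "toxic", "non-toxic", "toxic", "non-toxic", "toxic"]

# Pipeline: vectorize text into TF-IDF weights, then fit a linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(comments, labels)

print(model.predict(["you worthless idiot, go away"])[0])
```

With the real dataset you would swap in its comment and label columns; the "no strict definition, label what you personally find toxic" approach described above lives entirely in the labels, so the model code is unchanged.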
- Ask HN: Who is hiring? (January 2022)
Love language? So do we, and our mission is to infuse AI with that same love. At Surge, we're building the human infrastructure to power NLP — from detecting hate speech, to parsing complex documents, to injecting human values into the next wave of language models. Our first product is a platform that helps ML teams create amazing, human-powered datasets to train AI in the richness of language.
We're a team of former Google, Facebook, and Airbnb engineering leads, and we work with top companies at the forefront of machine learning. Our tech stack is Ruby on Rails, React, and Python. We're rapidly growing, and we're looking for full-stack engineers to join the team and develop our product.
To apply, please email [email protected] with a resume and 2-3 sentences describing your interest in Surge. We love personal projects and writings too!
More information: https://www.surgehq.ai/about#careers
A blog post explaining the problems we are working to solve: https://www.surgehq.ai/blog/the-ai-bottleneck-high-quality-h...
- The Toxicity Dataset – building the largest free dataset of online toxicity
- [Free] The Toxicity Dataset — building the world's largest free dataset of online toxicity [Github]
- The Toxicity Dataset — building the world's largest free dataset of online toxicity
- The Toxicity Dataset (1000 social media comments) — any ideas for interesting visualizations? [github]
- The Toxicity Dataset - free dataset of online toxicity (Github) - could be used for interesting portfolio projects
- The Toxicity Dataset — free dataset of online toxicity (Github)
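For the visualization and portfolio-project ideas raised in the mentions above, a natural first step is loading the dataset's CSV and tallying the labels. A hedged sketch with pandas: the column names (`text`, `is_toxic`) and the inline sample are assumptions used so the snippet is self-contained, not the repo's confirmed schema.

```python
# Load a small stand-in sample of the dataset and count labels per class.
# In practice, replace the inline string with the CSV file shipped in the
# surge-ai/toxicity repo (file name and columns assumed, not verified).
import io

import pandas as pd

sample_csv = """text,is_toxic
"thanks, this helped a lot",Not Toxic
"you are an idiot",Toxic
"interesting point, well argued",Not Toxic
"""

df = pd.read_csv(io.StringIO(sample_csv))

# Label distribution is the simplest starting point for a visualization,
# e.g. a bar chart of toxic vs. non-toxic counts.
print(df["is_toxic"].value_counts())
```

From here, per-class word frequencies or comment-length histograms are easy next steps for the kind of portfolio visualizations the mentions suggest.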
Stats
surge-ai/toxicity is an open-source project licensed under the MIT License, an OSI-approved license.