- sitcom-simulator-cli: Discontinued. A tool that combines GPT-3, Stable Diffusion, and FakeYou to create fully automated videos. [Moved to: https://github.com/joshmoody24/sitcom-simulator]
- NetVendor: Finds everything on a network from a Cisco (etc.) IP ARP file. Great for benchmarking networks.
Indeed. Here's the code and here's the channel. (The code isn't very easy to set up yet; I'll be working on making it more developer-friendly in the near future.)
So I built a GitLab pipeline that creates a backup of this upstream DB, without downtime, using various SQL utilities. The archive is then staged into an image so the data unpacks and loads on startup. I then used a data subsetter called condenser to create datasets for specific use cases. Now devs can load reliable dev data more quickly, test against the same data QA uses but within their own environments (local and preview), and create datasets for their own use cases.
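A minimal sketch of what such a pipeline could look like. The job names, the registry URL, the `UPSTREAM_DB_URL` variable, and the condenser invocation are all illustrative assumptions, not the actual pipeline; it assumes PostgreSQL, where `pg_dump` takes an online logical backup without blocking writers:

```yaml
stages:
  - backup
  - image
  - subset

backup_upstream:
  stage: backup
  script:
    # Online logical backup; pg_dump doesn't block writers, so no downtime
    - pg_dump --format=custom --file=upstream.dump "$UPSTREAM_DB_URL"
  artifacts:
    paths: [upstream.dump]

build_seed_image:
  stage: image
  script:
    # Official postgres images run anything placed in
    # /docker-entrypoint-initdb.d on first startup, which is one way
    # to get the dump to "unpack and load on startup"
    - docker build -t registry.example.com/dev-data:latest .
    - docker push registry.example.com/dev-data:latest

subset_datasets:
  stage: subset
  script:
    # Hypothetical invocation; condenser's real CLI may differ
    - python condenser.py --config subset-config.json
```

The key design point is that the dump is a pipeline artifact, so the image build and the subsetting jobs both consume the same snapshot.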
Needing to benchmark what was on my network, I built NetVendor, which takes network router output called an ARP table and turns it into actionable data about exactly what is on the network.
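The core idea can be sketched in a few lines. This is not NetVendor's actual code, just an illustration of parsing Cisco `show ip arp` output into structured records, where the first six hex digits of each MAC (the OUI) identify the hardware vendor:

```python
import re

# Matches one line of Cisco "show ip arp" output, e.g.:
# Internet  10.1.1.5        12   0050.56be.1234  ARPA   Vlan10
ARP_LINE = re.compile(
    r"Internet\s+(?P<ip>\d+\.\d+\.\d+\.\d+)\s+\S+\s+"
    r"(?P<mac>[0-9a-f]{4}\.[0-9a-f]{4}\.[0-9a-f]{4})\s+ARPA\s+(?P<iface>\S+)"
)

def parse_arp_table(text):
    """Return a list of {ip, mac, iface, oui} dicts from raw ARP output."""
    hosts = []
    for line in text.splitlines():
        m = ARP_LINE.search(line)
        if m:
            # First 6 hex digits of the MAC are the vendor OUI
            oui = m.group("mac").replace(".", "")[:6].upper()
            hosts.append({**m.groupdict(), "oui": oui})
    return hosts
```

From there, mapping each OUI against the IEEE vendor registry gives the "who made every device on my network" view.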
I talked to a friend and he told me the same; now we're building an open-source project to sync data from any external API to your local DB:
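The general shape of such a sync is straightforward. A minimal sketch, assuming a cursor-paginated API and SQLite as the local store (both assumptions; the `fetch_page` interface is hypothetical and real APIs paginate in various ways):

```python
import sqlite3

def sync(fetch_page, db_path="local.db"):
    """Pull all pages from an external API and upsert them locally.

    fetch_page(cursor) -> (records, next_cursor) is an assumed interface;
    next_cursor of None means no more pages.
    """
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS items (id TEXT PRIMARY KEY, payload TEXT)"
    )
    cursor = None
    while True:
        records, cursor = fetch_page(cursor)
        conn.executemany(
            # Upsert keeps the local copy idempotent across repeated syncs
            "INSERT INTO items (id, payload) VALUES (?, ?) "
            "ON CONFLICT(id) DO UPDATE SET payload = excluded.payload",
            [(r["id"], r["body"]) for r in records],
        )
        conn.commit()
        if cursor is None:
            break
    return conn
```

The upsert is what makes re-running the sync safe: records that already exist locally are updated in place rather than duplicated.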
Now, the reason for the asterisk in the last section is that it's obviously a little more complicated than simply hardcoding the news sources into the software itself. OSINTer works by collecting information and then storing it in a database until it's needed by the CTI researcher, which means that when scraping websites we want to be able to filter out unnecessary information and clutter like ads, layout-specific sections, and other parts of the website that aren't directly related to the news story.

To do this, OSINTer makes use of a Domain-Specific Language (DSL) that I created. That sounds rather fancy, but what it translates to is that OSINTer takes in a series of files in a simple, structured JSON format describing which websites to scrape and which parts of those websites to keep. This means not only that adding a new news source is very fast (approx. 5 minutes), but also that if OSINTer were to be used for trend research in a completely different area, you could swap out these JSON files and have OSINTer collect completely different data. Within the context of OSINTer, these DSL files are called profiles and can be found at https://gitlab.com/osinter/profiles
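To make that concrete, a profile could look something like the following. The field names here are purely illustrative, not OSINTer's actual schema (the real profiles are in the repo linked above); the point is the division between "where to fetch" and "what to keep or discard":

```json
{
  "source": {
    "name": "example-news-site",
    "address": "https://news.example.com/feed"
  },
  "scraping": {
    "container": "article.post-body",
    "remove": ["div.ad-banner", "aside.related-stories", "footer"]
  }
}
```

Because the scraper itself is generic and all the site-specific knowledge lives in files like this one, adding a source is a data change rather than a code change.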
It didn't really require much in the way of third-party modules. Here's the requirements.txt. There's PySimpleGUI in there for reviewing and approving uploads. I guess technically requests isn't built in. There's a module for working with Git, watchdog watches for new files, and pydub handles the audio file conversions.