dji-aerial-georeferencing vs poor-mans-vr

| | dji-aerial-georeferencing | poor-mans-vr |
|---|---|---|
| Mentions | 8 | 3 |
| Stars | 163 | 28 |
| Growth | 6.7% | - |
| Activity | 3.2 | 7.0 |
| Latest commit | 8 months ago | 7 months ago |
| Language | JavaScript | JavaScript |
| License | Apache License 2.0 | - |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
dji-aerial-georeferencing
-
DJI drone flight log viewer
How does this compare with the data you get from Airdata?
When I was working on an aerial georeferencing project[1], I found that the on-device flight logs didn't contain all the info in the server-side logs (e.g. heading, gimbal orientation, GPS coords, altitude).
[1] https://github.com/roboflow/dji-aerial-georeferencing
-
Container Terminal’s Satellite imagery processing
You can also do it if you know the height and field of view (I open sourced code to do this from drone videos here: https://blog.roboflow.com/georeferencing-drone-videos/ )
-
Calculate angular distance between two pixels
Yeah that’s pretty much it. I wrote about something similar for translating pixel coordinates from drone videos to GPS coordinates: https://blog.roboflow.com/georeferencing-drone-videos/
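The approach described in these comments (knowing the drone's altitude, heading, and camera field of view) can be sketched roughly as follows. This is a hypothetical illustration, not the repo's actual code; the function and parameter names are made up, and it assumes a nadir (straight-down) camera and a small-offset flat-earth approximation:

```javascript
const EARTH_RADIUS_M = 6378137; // WGS84 equatorial radius

// Convert a pixel in a straight-down drone frame to an approximate GPS
// coordinate, given altitude, horizontal field of view, and heading.
function pixelToGps(pixel, image, drone) {
  // Ground width covered by the frame: 2 * altitude * tan(hfov / 2)
  const groundWidth =
    2 * drone.altitudeM * Math.tan((drone.hfovDeg * Math.PI) / 360);
  const metersPerPixel = groundWidth / image.width;

  // Offset from the image center, in meters (image y grows downward)
  const dx = (pixel.x - image.width / 2) * metersPerPixel;
  const dy = (image.height / 2 - pixel.y) * metersPerPixel;

  // Rotate by the drone's heading (0 deg = north, clockwise positive)
  const h = (drone.headingDeg * Math.PI) / 180;
  const east = dx * Math.cos(h) + dy * Math.sin(h);
  const north = -dx * Math.sin(h) + dy * Math.cos(h);

  // Small-offset approximation: meters to degrees of lat/lon
  const lat = drone.lat + (north / EARTH_RADIUS_M) * (180 / Math.PI);
  const lon =
    drone.lon +
    (east / (EARTH_RADIUS_M * Math.cos((drone.lat * Math.PI) / 180))) *
      (180 / Math.PI);
  return { lat, lon };
}
```

A pixel at the image center maps back to the drone's own coordinates; pixels toward the right edge shift east (for a north heading). The real project also has to account for gimbal tilt, which this sketch ignores.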
-
Finding & plotting solar panels from drone videos
I'm not the OC (that'd be u/aloser). Based on the blog post breakdown (https://blog.roboflow.com/georeferencing-drone-videos/), it's a recorded video matched up with the flight log to identify the location of certain objects.
I published the code here: https://github.com/roboflow-ai/dji-aerial-georeferencing
-
Show HN: Finding and plotting solar panels in drone videos using computer vision
Had a lot of fun building this; did an in-depth writeup here as well: https://blog.roboflow.com/georeferencing-drone-videos/
poor-mans-vr
-
Ask HN: Show me your half baked project
https://github.com/muxamilian/poor-mans-vr
A poor man's VR: Using the front camera and tensorflow.js, the smartphone becomes a “window” into the real world. Video and image content appear as if they were seen through this window. To do this, the viewer’s position is determined using a neural network. The viewed content is then moved according to the viewer’s position. This makes it seem like the content is physically behind the smartphone and is viewed through the smartphone’s screen. This effect is especially useful for content captured using an ultra-wide lens.
- A poor man’s VR (front camera and tensorflow.js)
- Show HN: A poor man’s VR (front camera and tensorflow.js)
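The "window" effect described above can be sketched as a simple parallax mapping. This is a hypothetical illustration, not the project's actual code: it assumes the viewer's head position has already been estimated (e.g. by a tensorflow.js face-detection model) and only shows how that position could translate the content so it appears to sit behind the screen:

```javascript
// Map a viewer's head position (relative to the screen center, in meters;
// z = distance from the screen) to a pixel translation of the content.
// By similar triangles, a head offset x with content depthM behind the
// screen shifts the content point seen at screen center by x * depthM / z,
// so the content slides with the head, magnified, revealing the scene on
// the opposite side -- as when looking through a real window.
function windowOffset(head, view) {
  const scale = view.depthM / head.z;
  return {
    x: head.x * scale * view.pxPerMeter,
    y: head.y * scale * view.pxPerMeter,
  };
}
```

With the head centered the offset is zero; moving 10 cm left at 0.5 m from the screen, with content 1 m "behind" it, slides the content 20 cm (in screen pixels) the same way.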
What are some alternatives?
afterflight - An application for analysis of UAV log and video
ai-deadlines - :alarm_clock: AI conference deadline countdowns