sdcv VS peerreview

Compare sdcv vs peerreview and see how they differ.

peerreview

A diamond open access (free to access, free to publish), open source scientific and academic publishing platform. (by danielBingham)
                sdcv                                    peerreview
Mentions        2                                       7
Stars           280                                     51
Growth          -                                       -
Activity        1.9                                     8.8
Last commit     11 months ago                           14 days ago
Language        C++                                     JavaScript
License         GNU General Public License v3.0 only    GNU Affero General Public License v3.0
Mentions - the total number of mentions we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates a project is among the top 10% of the most actively developed projects we track.

sdcv

Posts with mentions or reviews of sdcv. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-09-16.

peerreview

Posts with mentions or reviews of peerreview. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-06-28.
  • Request for Feedback: An open-source, open-access, community governed academic publishing platform that crowdsources review using reputation
    2 projects | /r/AskAcademia | 28 Jun 2023
    Hey everyone, I'm an experienced software engineer from an academic family. I've been aware of the problems in academic publishing for most of my life, and for the last several years I've been running headlong into paywalls as I work on municipal policy advocacy. I've been pondering software solutions to this problem for a long time.

    This is exactly the sort of problem internet-based software is, in theory, best suited to solving: sharing and discussing information. It should be possible to build a web platform that allows academia to share work, collect feedback, organize review that maintains quality, and find relevant papers without relying on private, for-profit journal publishers. It should be possible to build and run a web platform that handles all of academic publishing for 1% of the current cost of for-profit publishing or less - which would (in theory) allow universities to keep it funded while allowing it to be free to publish and free to access. Hell, it could probably be run lean enough that individual academics could fund it through small-dollar donations. There's really no good reason to allow private publishers to charge academia $11 billion a year while keeping 80% of the work locked behind paywalls.

    I've had several ideas for how to approach the problem, and I spent the last year building out a beta of one of them as a side project. Software development is experimental and iterative. It only works when developers are able to get active feedback from the people most affected by the problems they are trying to solve. So I'm reaching out for feedback on the beta, and on possible paths forward.

    The web platform I've built enables crowdsourced peer review of academic papers. It uses a reputation system (similar to StackExchange) and ties reputation to a field/concept tagging system.
    Submitted papers must be tagged with 1 to n fields, and only peers who have passed a reputation threshold in one of the tagged fields may offer review.

    Review is split into two phases: pre-publish and post-publish. Pre-publish review is author driven. It's focused on collaborative, constructive feedback and uses an interface heavily inspired by both Github Pull Requests and Google Docs. Post-publish review is much closer to traditional review, and is focused on maintaining the integrity of the literature by filtering out spam, misinformation, fraud, and poorly done work.

    Reputation is mostly gained and lost through voting that happens during post-publish review. Reputation can also be gained by offering particularly constructive pre-publish reviews. All reviews are open and published alongside the papers. Post-publish review is ongoing. That's iteration one.

    As much as I believe review could be crowdsourced, it seems pretty clear that going straight from what we have to this platform would be a huge leap. So I have ideas for how to build a journal overlay on top of the crowdsourced review system that would allow editors to manage teams of reviewers and run their journals through the platform. This would let them take advantage of the review interface, and would still give authors the benefit of being able to have a conversation with their reviewers. Authors would then be able to choose to submit their papers to one or more journals, crowdsourced review, or both. Building that out is the next project.

    Right now I'm working on this as a side project and an experiment -- could a web platform like this work? Would people even use it? If the answer turns out to be yes, I'd love for it to become a non-profit, multi-stakeholder cooperative. Essentially independent public infrastructure similar to Wikipedia, only more transparent and more clearly democratically governed.
    I would love feedback on all aspects of this project - both the current crowdsourcing iteration and the idea of building a generic, open platform for diamond open access journals to run their operations through. Could you ever see yourself using something like this to publish? What about to collect pre-print review? Could you see yourself reviewing through it? What about submitting to journals through it? Are there other approaches to building a web platform that might work better? Am I barking up the wrong tree? Should I press forward, abandon it, or is there a better tree?

    You can find the beta platform here: https://peer-review.io
    The source here: https://github.com/danielbingham/peerreview
    More details about exactly how it works (in its current iteration) here: https://peer-review.io/about
    And an open roadmap here: https://github.com/users/danielBingham/projects/6/views/1
  • Show HN: Scientific publishing platform to crowdsource review using reputation
    2 projects | news.ycombinator.com | 28 Jun 2023
  • Millions of dollars in time wasted making papers fit journal guidelines
    5 projects | news.ycombinator.com | 8 Jun 2023
  • Request for Feedback: Peer Review - Open Source, Open Access Scientific Publishing Platform drawing on Github and StackExchange
    2 projects | /r/Open_Science | 5 Jun 2023
    And the source code here: https://github.com/danielbingham/peerreview
  • Open-Source Science (OSSci) to launch interest group on reproducible science
    1 project | /r/Open_Science | 5 Jun 2023
    Last summer I finally saved up enough runway to take some time off work and put a significant amount of time into building an MVP beta of it ( https://peer-review.io, https://github.com/danielbingham/peerreview ). I've been trying to find folks interested in trying it out and exploring whether it could work.
  • Show HN: Peer Review Beta – A universal preprint+ platform
    1 project | news.ycombinator.com | 25 Apr 2023
    Hey HN,

    I've been working on Peer Review for the past year. It's still in early beta (pre-0.1) but I'm looking for some early adopters to start putting it through its paces and help highlight areas I should focus on.

    Peer Review is an idea I've had for years. You're probably well aware of the problems involved in academic, scientific, and scholarly publishing - HN certainly discusses them enough. Peer Review is my attempt to solve them (or a subset of them).

    Peer Review combines features of Github and StackExchange to allow scholarly review to be crowdsourced to a trusted pool of peers. It does this by tying reputation to a hierarchical field tagging system. Reputation gained in a child field is also gained in its parent fields. Authors tag their papers with any fields they feel are relevant.

    This means authors can tag their papers with fields higher up the hierarchy to cast a wider review net, or go lower down the hierarchy to cast a narrower one. It also enables cross-discipline review and collaboration very easily - authors simply tag their papers with the fields of both disciplines.

    The review interface combines aspects of Github PRs and Google docs.

    Review is split into two phases: pre-publish "review", focused on giving authors constructive critical feedback to help them improve their work, and post-publish "refereeing", which looks more like traditional peer review and is the primary mechanism through which new authors gain reputation.

    The whole site is built around the idea that scholars are working to collectively build the body of human knowledge and make it the best they can.

    You can see the production site here: https://peer-review.io

    You're welcome to explore the staging site and treat it as a sandbox, if you'd like: https://staging.peer-review.io

    It's open source: https://github.com/danielbingham/peerreview

    I'm doing all the development in the open as much as possible. If it gains traction, the plan is to form a non-profit around it and explore whether a web platform can be governed democratically as a multi-stakeholder cooperative, and whether that governance approach can solve some of the issues around large centralized platforms.

  • Ask HN: What interesting problems are you working on? ( 2022 Edition)
    29 projects | news.ycombinator.com | 16 Sep 2022
    I'm working open source and would welcome contributions! (https://github.com/danielbingham/peerreview)

    (Although the first contribution would probably need to be getting the local development environment working again in a new context... I've been going fast and taking on some tech debt that will need to be paid down soon.)

What are some alternatives?

When comparing sdcv and peerreview you can also consider the following projects:

jyut-dict - A free, open-source, offline Cantonese Dictionary for Windows, Mac, and Linux. Qt, SQLite. C++ and Python.

reals - A lightweight python3 library for arithmetic with real numbers.

qolibri - Continuation of the qolibri EPWING dictionary/book reader

typst - A new markup-based typesetting system that is powerful and easy to learn.

Console - The new Windows Terminal and the original Windows console host, all in the same place! [Moved to: https://github.com/microsoft/terminal]


Windows Terminal - The new Windows Terminal and the original Windows console host, all in the same place!

KeenWrite - Free, open-source, cross-platform desktop Markdown text editor with live preview, string interpolation, and math.

ConEmu - Customizable Windows terminal with tabs, splits, quake-style, hotkeys and more

tone - tone is a cross platform audio tagger and metadata editor to dump and modify metadata for a wide variety of formats, including mp3, m4b, flac and more. It has no dependencies and can be downloaded as single binary for Windows, macOS, Linux and other common platforms.

glslViewer - Console-based GLSL Sandbox for 2D/3D shaders

beets - music library manager and MusicBrainz tagger