Potential issues in PLUTO's solution: realizing a decentralized scholarly communication platform


Good day! This is the PLUTO team. 😃

Today, we want to share the team's concerns about potential issues with the platform's features.

For your understanding, we recommend reading the previous post, which contains a description of PLUTO's solution, before reading this one.

Let's take a look at the potential issues we are facing for each feature. Possible issues are presented as bullet points under each feature.


1. Research Achievement Dissemination

On the platform, any research information that has the potential to add value to academic development can be considered a research achievement with the consent of the participants. This information can be shared in various forms, and the authors retain all rights to their own content; letting authors set their own license policy is the basic philosophy of sharing content on PLUTO.

∎ As research achievements are expressed in various types and formats, the detailed evaluation items for each could differ as well. For example, if a research achievement is shared as text, it may require an assessment of its writing, whereas an animated format may not. The team is therefore considering distinct evaluation items for different types and formats.

∎ Content is freely accessible during the blind review period, so offering authors a pricing option may be meaningless: there is a risk of abuse by copying and reusing the content during blind review. The team is therefore looking for ways to protect authors' rights; another option is a platform-wide guideline under which all content is published open access.

2. Peer Review

Shared research achievements will initially receive blind review and will transition to a public review period after certain conditions are met. Contributors who perform evaluations receive rewards in tokens and reputation. This process is the new model of peer review that PLUTO has planned.

∎ Researchers may request retraction of content they have shared. In this case, reviewers might not receive rewards despite their contributions. We are therefore considering a policy under which an author cannot retract during the blind review period once more than N evaluations have been made. Retractions after the blind period might be decided through agreement among platform participants.

∎ Submitted content may fail to meet the conditions for transitioning to the public review period for a long time. This is a very important issue. We are considering ways to surface content that lacks reviews, such as pinning it at the top of the feed or creating a separate category for it. If you have a good idea, please comment.

∎ Researchers may want to pay additional tokens to reviewers. We are therefore thinking of a system in which an author sets a default token amount m and an additional token amount M when deciding how many tokens to provide to reviewers; the platform distributes m, and the author distributes the additional M tokens at their own discretion (see the sketch at the end of this section).

∎ We are considering whether there should be separate repositories for research achievements above and below a certain criterion. This would be especially significant for quality control if, for strategic user acquisition, PLUTO has to compete with existing high-impact-factor journals. If separate repositories exist, we think that research achievements exceeding a certain standard should be made more widely available through a digital library, while those below the standard should be used for reference purposes only.
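
To make the reviewer-reward idea above more concrete, here is a minimal sketch of how the default amount m and the additional amount M might be split among reviewers. The equal split of m, the function name, and the parameter names are our own illustrative assumptions, not a finalized design.

```python
from typing import Dict, List, Optional

def distribute_review_rewards(
    reviewers: List[str],
    default_pool_m: float,
    extra_grants_M: Optional[Dict[str, float]] = None,
) -> Dict[str, float]:
    """Split review rewards for one research achievement (illustrative only).

    default_pool_m: the default token amount m set by the author,
        split equally among all reviewers here (an assumed rule).
    extra_grants_M: the additional tokens M that the author distributes
        at their own discretion, keyed by reviewer.
    """
    rewards = {reviewer: 0.0 for reviewer in reviewers}
    if reviewers:
        base_share = default_pool_m / len(reviewers)
        for reviewer in reviewers:
            rewards[reviewer] += base_share
    for reviewer, extra in (extra_grants_M or {}).items():
        rewards[reviewer] = rewards.get(reviewer, 0.0) + extra
    return rewards

# Example: m = 30 tokens split equally; the author grants 10 extra tokens to one reviewer.
print(distribute_review_rewards(["alice", "bob", "carol"], 30.0, {"bob": 10.0}))
# {'alice': 10.0, 'bob': 20.0, 'carol': 10.0}
```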

3. Researcher Social Network (On-demand Inquiries)

The basic concept is that bounty rewards in tokens are set in advance, and the answer chosen by either the requester or the community receives the reward. Through the researcher social network, researchers can request verification of reproducibility, validation of data, a proxy experiment, the data itself, or any other information relevant to a piece of research.

∎ Current research suffers from a lack of reproducibility. The team therefore believes it will be a challenge to establish verification of data validity and of a study's reproducibility as standard academic practice.

∎ We are considering whether the requester should have the option of choosing a bounty model, for example between i) no reward, ii) reward distribution based on the requester's decision, and iii) reward distribution based on the community's decision. We are also considering fixing a single model through the platform's own policy (see the sketch at the end of this section).

∎ We are also considering whether inquiries and answers can have their own copyright policies. Forcing them to be open places a burden on participants and may make them defensive, so we are thinking of offering a copyright policy option here as well.
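
As a discussion aid for the bounty options mentioned above, here is a minimal sketch of how the three models might be represented and resolved. The enum values and the simple highest-vote rule for the community model are illustrative assumptions only, not committed platform behavior.

```python
from enum import Enum
from typing import Dict, Optional

class BountyModel(Enum):
    NO_REWARD = "no_reward"          # i) inquiry carries no bounty
    REQUESTER_DECIDES = "requester"  # ii) requester picks the winning answer
    COMMUNITY_DECIDES = "community"  # iii) community vote picks the winner

def resolve_bounty(
    model: BountyModel,
    bounty: float,
    requester_choice: Optional[str] = None,
    community_votes: Optional[Dict[str, int]] = None,
) -> Dict[str, float]:
    """Return a mapping of answer id -> tokens awarded (illustrative only)."""
    if model is BountyModel.NO_REWARD or bounty <= 0:
        return {}
    if model is BountyModel.REQUESTER_DECIDES and requester_choice:
        return {requester_choice: bounty}
    if model is BountyModel.COMMUNITY_DECIDES and community_votes:
        winner = max(community_votes, key=community_votes.get)
        return {winner: bounty}
    return {}  # unresolved: the bounty would stay in escrow

# Example: a 50-token bounty resolved by community vote.
print(resolve_bounty(BountyModel.COMMUNITY_DECIDES, 50.0,
                     community_votes={"answer-1": 3, "answer-2": 7}))
# {'answer-2': 50.0}
```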

4. Researcher Reputation

Because a reasonable reputation formula is one of the cores of the platform, the reputation formula must be progressively improved throughout the platform's development roadmap. Basically, reputation is based on contributing activities on the platform and the number of votes representing endorsement from other participants. Reputation gives weight to a researcher's intentions on the platform and is used as a measure of confidence.
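
To illustrate what such a formula could look like at its simplest, here is a toy sketch in which reputation is a weighted sum of contribution counts and endorsement votes. The activity types, weights, and logarithmic damping are purely our own assumptions for discussion; the actual formula is exactly what needs to be iterated on over the roadmap.

```python
import math
from typing import Dict

# Hypothetical weights per activity type; real values would be tuned over time.
ACTIVITY_WEIGHTS = {
    "achievement_shared": 5.0,
    "review_written": 3.0,
    "inquiry_answered": 2.0,
}
VOTE_WEIGHT = 1.0

def reputation_score(activity_counts: Dict[str, int], endorsement_votes: int) -> float:
    """Weighted sum of on-platform contributions and endorsement votes.

    The log1p damping is an assumed rule meant to reduce the payoff of
    spamming a single activity type.
    """
    score = sum(
        ACTIVITY_WEIGHTS.get(activity, 0.0) * math.log1p(count)
        for activity, count in activity_counts.items()
    )
    score += VOTE_WEIGHT * math.log1p(max(endorsement_votes, 0))
    return round(score, 2)

# Example: 4 shared achievements, 10 reviews written, 25 endorsement votes.
print(reputation_score(
    {"achievement_shared": 4, "review_written": 10, "inquiry_answered": 0},
    endorsement_votes=25,
))
```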

∎ There is a concern about how to determine the initial reputation of newly registered users. Possible options include social proof by existing users and external APIs covering achievements outside PLUTO. Related to this, there are concerns about how to verify researchers' identities (identity verification, IDV). This might also be solved with social proof; other options include IDV DApps such as Civic and uPort, or projects like ID2020.

∎ Because the reputation formula is open, there is a risk of it being gamed. We need a design that mitigates this risk.

∎ On the platform, a researcher's reputation can be broken down by role, such as author, reviewer, and others. We are debating whether a separate reputation should be shown for each type of activity.

∎ We are considering whether to maintain reputations per research field. This seems quite challenging, since clearly categorizing research into fields is hard work and interdisciplinary research is increasing. It might be solved through community consensus on which fields should be tagged to a research achievement or a researcher. If discriminating reputations by field fails entirely, users might instead be given a choice of reputation when taking an action: one for their major field of interest and another for minor ones.

5. Resource Allocation

Platform reputation and research achievement evaluation scores provide reliable and relevant indicators, so research resources can be distributed through the platform in a transparent and efficient manner. Such resource allocation includes, but is not limited to, management of research funds and grant decisions, crowdfunding for research, recruitment or proposals for joint research, rental of equipment and facilities, and translation and proofreading services.

∎ Crowdfunding in general promises to reward contributors in proportion to their contribution. Research crowdfunding would require a whole new concept of incentives; for instance, contributors might be granted free rights of use over the research they helped fund.

∎ If the actual transactions of resource allocation, beyond decision-making, are to be made over the platform, a well-designed form of contract is required for each type of resource (a rough sketch follows below).
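
Purely as a discussion aid, here is a sketch of the kind of fields such a contract form might record, independent of how it would eventually be implemented on-chain. Every type and field name here is a hypothetical placeholder, not a committed schema.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List

class ResourceType(Enum):
    GRANT = "grant"
    CROWDFUNDING = "crowdfunding"
    JOINT_RESEARCH = "joint_research"
    EQUIPMENT_RENTAL = "equipment_rental"
    TRANSLATION = "translation"

@dataclass
class ResourceContract:
    """Hypothetical fields a resource-allocation agreement might record."""
    resource_type: ResourceType
    provider: str                     # party supplying funds, equipment, or services
    recipient: str                    # researcher or group receiving the resource
    token_amount: float               # value locked for the agreement
    deliverables: List[str] = field(default_factory=list)  # e.g. reports, datasets
    dispute_resolution: str = "community_vote"             # assumed default mechanism

# Example: a small equipment rental agreement.
contract = ResourceContract(
    resource_type=ResourceType.EQUIPMENT_RENTAL,
    provider="lab-a",
    recipient="researcher-b",
    token_amount=120.0,
    deliverables=["usage report"],
)
print(contract)
```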

Your feedback will be a catalyst for innovating scholarly communication.
Thank you!