Wikipedia talk:Guestbook for non-Chinese speakers/Archives/2015
Global renaming and abuselog
Hi, I noticed while performing a rename (nowadays only stewards & global renamers handle these) that I couldn't move the userpage; it was blocked by abuse filter 49. Looking at the abuse log, it seems most of the affected users are global renamers or stewards performing renames. Global renamers now skip the abuse filter when performing a global rename, but we stewards don't when performing a local rename to finalise SUL. I wonder whether this filter can be amended, or where I should request that it be improved. Could, in the meantime, a local administrator move User talk:Bp0 to User talk:Bp0 (usurped) as part of my rename? With kind regards, Savh（留言） 2015年1月3日 (六) 09:52 (UTC)
- Hi, as AbuseFilter does not recognize global groups at all, I don't see an easy way for us to check for stewards. I thought bugs were already filed both for adding support for global groups to AF and for letting global renames in general bypass AF (why was this not done for stewards?), but please feel free to poke some developers :) Jimmy Xu 论 2015年1月3日 (六) 13:21 (UTC)
Looking for feedback on my funding proposal to work with UNESCO
Firstly, please excuse that this message is in English, and apologies if I have put it in the wrong place. I'm looking for feedback on and endorsement of my Wikimedia Foundation PEG grant proposal to be Wikimedian in Residence at UNESCO. I'd very much appreciate it if you would have a look; I want to include as many different languages as possible and connect editors in each country with local UNESCO partners. The goals most relevant to Wikipedia are:
- 1. Train UNESCO and its partner organisations to contribute to Wikimedia projects: Provide UNESCO and its partners with the skills, tools, resources and connections to contribute to Wikimedia projects in a meaningful, measurable and sustainable way, and integrate them into the Wikimedia community both online and by matching them with local Wikimedia organisations and volunteers for in-person support and collaboration. The project will create and improve content receiving 100,000,000 views per year on Wikimedia projects and introduce 1,000 people across over 200 organisations to Wikimedia projects. This will include 500 newly registered users trained to contribute to Wikimedia projects and 500 articles formally reviewed by experts.
- 2. Make content from the archives of UNESCO and its partners available on Wikimedia projects: This project will facilitate the upload of 30,000 images, audio files, videos, datasets and other content to Wikimedia projects from the UNESCO archives (24,000 images), the UNESCO Institute for Statistics (UIS) and other sources, including 10 organisations changing their content licenses to be Wikimedia-compatible. A completed pilot project is outlined in the Goal section.
I ran a pilot project that produced the images found in the Wikimedia Commons category Images from the archive of UNESCO; here are a few examples relevant to Wikipedia:
If you think this is a worthwhile project, please click this link and then click the endorse button.
Hello dear Chinese Wikipedians, please help me translate the article Frank Christoph Schnitzler from the English Wikipedia into the Chinese Wikipedia. Please start this article for my Chinese friends. I speak neither good English nor Chinese. Please help us translate this article from en.wikipedia to zh.wikipedia. Thank you very much. I wish you a very good time. Best regards, Joe. —The preceding undated comment was added before 2015年1月31日 (六) 08:42 (UTC).
- @Weft：This small issue has been solved by User:GroverChouT. It's alright now. --Whaterss（留言） 2015年3月21日 (六) 05:08 (UTC)
Regarding the copyright-infringement notice an administrator raised against the 站出来 硬地音乐挑战赛 ("Stand Out" Indie Music Challenge) article: the source, the event's Facebook post, is indicated below. I am one of the organisers of the event and one of the administrators of its Facebook page. https://www.facebook.com/notes/%E7%AB%99%E5%87%BA%E4%BE%86-%E6%88%91%E5%80%91%E5%92%96%E5%95%A1%E5%BB%A3%E5%A0%B4%E8%A6%8B%E7%A1%AC%E5%9C%B0%E9%9F%B3%E6%A8%82pk%E8%B3%BD/%E5%BE%9E%E7%94%B2%E5%AD%90%E5%9C%92%E5%88%B0%E7%AB%99%E5%87%BA%E4%BE%86/143188299149194?comment_id=193862&offset=0&total_comments=4 The quoted text was used with the author's consent and is not an infringement. As this is my first time using Wikipedia, how do I go back to the original text to re-license it and mark the source? Could the administrators please take the time to reply? Thank you!! --Blovecb（留言） 2015年4月16日 (四) 03:42 (UTC)Blovecb
Emission standards for trucks
Wiki labels & Revision Scoring as a Service for Chinese Wikipedia
Hello Chinese Wikipedia,
I apologize for my complete lack of Chinese skills. I would very much welcome it if my post were translated into Chinese.
So computers are very good at crunching numbers: your average calculator can outsmart you in arithmetic. However, computers are terrible at pretty much everything else. Programming computers to undertake any task beyond computation, no matter how simple, tends to be very difficult. This is where artificial intelligence comes in: with artificial intelligence, we teach computers how to solve problems without explicitly programming the solution. This is what we are doing.
We are working on a project called m:Research:Revision scoring as a service, which aims to provide quality-control artificial intelligence infrastructure for MediaWiki and Wikimedia projects. We already have our system implemented and running on the Azerbaijani, English, French, Indonesian, Persian, Portuguese, Spanish, Turkish and Vietnamese editions of Wikipedia. We hope to adapt our tool to serve Chinese as well as a number of other languages.
We are currently focusing mainly on vandalism detection, where we provide an API (m:ORES) that returns scores. We have made an effort to keep our system robust.
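As a rough illustration of how a client might consume such scores (the JSON shape and the revision ID below are simplified assumptions for illustration, not the service's documented schema):

```python
# Sketch of reading a revert-probability score from an ORES-style response.
# The response structure here is an assumption made for illustration only.

def reverted_probability(response, rev_id):
    """Return the model's estimated probability that the revision is reverted."""
    return response[str(rev_id)]["reverted"]["probability"]["true"]

# Hypothetical response for a single revision.
sample = {
    "654321": {
        "reverted": {
            "prediction": True,
            "probability": {"true": 0.90, "false": 0.10},
        }
    }
}

print(reverted_probability(sample, 654321))  # → 0.9
```

A real client would fetch this JSON from the scoring service over HTTP and then read the probability field the same way.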
The examples I'll provide are based on a machine learning algorithm trained on 20,000 reverted edits. This kind of modelling is problematic for two reasons. First, edits are reverted for reasons unrelated to vandalism, such as mistakes by new users, which introduces an unproductive bias. Second, it cannot distinguish good-faith users from malicious ones. To demonstrate our system I will give three examples from the English Wikipedia, picked semi-randomly.
- Score of 90% diff en:Moncef Mezghanni
- As visible in the diff, it is clearly something that shouldn't be welcome on the English Wikipedia. The algorithm's confidence also matches my human assessment.
- Score of 75% diff en:Monin
- When I look at the diff, it isn't immediately clear to me whether it should be reverted. A detailed look reveals that the prior version had more neutral information, but the new version, while spammy, isn't clear-cut vandalism at a glance. The algorithm's confidence drops, just like my human assessment.
- Score of 19% diff en:Curiosity killed the cat, but satisfaction brought it back
- As visible in the diff, this edit clearly improves the article, and the algorithm's confidence plummets accordingly: it is more confident that this edit should NOT be reverted.
We are also working towards a system for article quality, where we use existing assessments by the en:Wikipedia:Version 1.0 Editorial Team to train our system. We only have this running on the English Wikipedia at the moment, but we would be more than happy to expand to other language editions. I am uncertain whether the Chinese Wikipedia has a similar quality assessment scale. I have picked 5 random articles to demonstrate this.
- Predicted: Start class (not even assessed) Perm link en:Maidenhead Advertiser
- Predicted: Stub class (actually marked Stub class) Perm link en:Joel Turrill
- Predicted: C class (actually marked Stub class) Perm link en:Kajaanin Haka
- Predicted: C class (actually marked C class) Perm link en:Castell Arnallt
- Predicted: Featured class (actually marked Featured article) Perm link en:Hurricane Diane
A typical problem is that humans do not re-assess articles over time, or articles are never assessed in the first place. Our system circumvents this by automating the assessment.
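The five spot checks above can be tallied in a few lines. This small sketch uses only the predictions and human assessments listed above (with `None` marking the article that was never assessed) and counts agreement over the assessed articles:

```python
# Predicted vs. human-assigned quality class for the five sample articles.
# None marks the article that was never assessed by humans.
samples = [
    ("Maidenhead Advertiser", "Start", None),
    ("Joel Turrill", "Stub", "Stub"),
    ("Kajaanin Haka", "C", "Stub"),
    ("Castell Arnallt", "C", "C"),
    ("Hurricane Diane", "FA", "FA"),
]

# Only compare where a human assessment exists.
assessed = [(pred, actual) for _, pred, actual in samples if actual is not None]
matches = sum(1 for pred, actual in assessed if pred == actual)

print(f"{matches}/{len(assessed)} predictions match the human assessment")  # → 3/4
```

The unassessed article illustrates the problem in the paragraph above: with no human label at all, an automated prediction is the only assessment available.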
Unfortunately, we lack language features for Chinese, such as lists of bad words, informal words and stop words; these would be very helpful. We also need a localization of en:Wikipedia:Labels to serve as our local landing page.
Once these are complete, we would like to start an edit quality campaign in which we ask the local community to hand-code/label ~2,000 revisions, marking them productive/damaging and good-faith/bad-faith. This would be similar to the English Wikipedia campaign at en:Wikipedia:Labels/Edit quality.
After this we will be able to generate scores for revisions that are usable by gadgets such as ScoredRevisions as well as (potentially) tools like Huggle. If the community desires it, the scores can even be used to create a local vandalism-reversion bot.
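To sketch how a gadget or bot might act on these scores (the two thresholds below are arbitrary illustrative values, not anything the project recommends; a real tool would tune them against labeled data):

```python
# Toy triage of edits by revert-probability score.
# Both thresholds are made-up illustrative values.
REVERT_THRESHOLD = 0.85   # above this, a bot might queue an automatic revert
REVIEW_THRESHOLD = 0.50   # above this, flag for human patrol

def triage(score):
    """Map a revert-probability score to an action bucket."""
    if score >= REVERT_THRESHOLD:
        return "auto-revert candidate"
    if score >= REVIEW_THRESHOLD:
        return "needs human review"
    return "probably fine"

# The three example scores from the English Wikipedia diffs discussed above.
for score in (0.90, 0.75, 0.19):
    print(score, "->", triage(score))
```

Under these assumed thresholds, the three earlier examples (90%, 75%, 19%) would fall into the three buckets in order, which matches the human assessments given above.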
So, in a nutshell, our algorithm relies on community input to support the community. Feel free to ask any questions, either here, on Meta, or on IRC in the #wikimedia-ai channel on the freenode server, where we hang out. You can also reach us at https://github.com/wiki-ai
- User:Jianhui67 do you think you can help with this? -- とある白い猫 chi? 2015年8月7日 (五) 20:03 (UTC)
- Awesome! Just in the last few days, many WikiProjects were set up on the Chinese Wikipedia, and a large number of talk pages were tagged with unassessed banners by bots, just like this (1,067 unassessed articles out of 1,369). Plus, a very large number of pages are not even tagged with a banner. As someone who also focuses on assessment, I believe it's a very useful tool for auto-assessing Stub-, Start- and C-Class articles. --CAS222222221 2015年8月8日 (六) 06:46 (UTC)
Wikilabels localization, final few things
Hello all, we have recently concluded the older campaigns for English, Portuguese, Persian and Turkish, which is partially why we had this gap. So we are very close to launching the edit quality campaign for this wiki as well. All we need is the translation of the relevant entries at m:Wiki labels/Interface translation and m:Wiki labels/Interface translation/Edit quality. We are very excited to expand our work to include this wiki and can start the campaign as soon as the two pages are translated. Thanks! -- とある白い猫 chi? 2015年10月11日 (日) 17:19 (UTC)
- One last thing: we auto-label revisions we think are likely good; these include revisions that have not been reverted in a while and revisions made by users with higher access (such as sysops). Which user groups aside from sysop are "trusted" on this wiki? The user groups I see are: autoreviewer, bot, bureaucrat, checkuser, confirmed, flow-bot, ipblock-exempt, oversight, patroller, rollbacker, sysop. -- とある白い猫 chi? 2015年10月22日 (四) 22:06 (UTC)
Help with a feedback comment
I need some help with this feedback posted by Suchichi02. I really want to answer his question, and it would be more comfortable for both of us if someone helped us. :)
Can anyone help?