20K22676

N2O Emissions from Vehicles Equipped with Three-Way ...
La négociation des contributions dans les wikis publics : légitimation ...
Template Deletion (T-D), and Template Modification (T-M), ... Specifically, we used two of the multi-label classifiers implemented in Mulan (Tsoumakas, Katakis, ...
Edit Categories and Editor Role Identification in Wikipedia
We study hoaxes in the context of Wikipedia, for which there are two good reasons: first, anyone can insert information into Wikipedia by creating and editing ...
Impact, Characteristics, and Detection of Wikipedia Hoaxes
Wikis are ubiquitous in organisational and private use and provide a wealth of textual data. Maintaining the currency of this textual data is important and ...
TS Wikipedia Corpus - LDC Catalog
ABSTRACT. Wikipedia is commonly viewed as the main online encyclopedia. Its content quality, however, has often been questioned.
Statistical Measure of Quality in Wikipedia
An n-gram in turn is a substring of n tokens of t, where a token can be a character, a word, or a part-of-speech (POS) tag. The Term Frequency × Inverse ...
Identifying Featured Articles in Spanish Wikipedia - SEDICI
... character set ... Word processors or HTML. Markdown was created by John Gruber in 2004 and is the default mechanism for documenting ...
GitHub Wiki Design and Implementation
It introduces the most relevant definitions and the related work for the research fields of semantic relatedness, named entity recognition, word sense ...
Utilising Wikipedia for Text Mining Applications - SciSpace
Model that uses both local (exact matching of n-grams of characters) and distributed (word embeddings) representations to compute a relevance score (Mitra ...
Design and Implementation of the Sweble Wikitext Parser
It presents the design and implementation of a parser for Wikitext, the wiki markup language of MediaWiki. We use parsing expression grammars where most ...
Cross-domain Text Classification using Wikipedia
Abstract. Traditional approaches to document classification require labeled data in order to construct reliable and accurate classifiers.