A 23-year-old man facing first-degree murder charges in the deaths of four people in Markham, Ont., appears to have sent online messages about planning to kill his family several months ago, CBC News has learned.
A user known as Menhaz said in March he was “gonna kill [his] parents and go to jail yo” in a private message on Discord, a chat program used by hundreds of millions of video game players around the world, according to a server administrator.
Canada “has no death penalty so I might be dicked but no death,” the user added.
It is the second set of messages to emerge from Discord about the killings. The same user appeared to describe the killings just hours before police were called to a Markham home on Sunday and found four bodies — raising questions about who bears responsibility to act when someone sends violent messages online.
York Regional Police arrested Menhaz Zaman at the scene and charged him on Monday with four counts of first-degree murder. Police said the victims were three women and one man. Friends and neighbours say Zaman lived in the home with his family.
CBC News has seen screen captures of the March messages, which were sent to a player of Perfect World Void — an online role-playing game akin to World of Warcraft — by the user known as Menhaz.
CBC News has also seen screen captures of the messages sent early Sunday, in which a user on the same account said they had “slaughtered” their “entire family.” Those messages were sent to at least two people who often played online with the user known as Menhaz.
Those two people told CBC News that, having known him online for years, they believe the user to be Menhaz Zaman.
CBC News has agreed to protect their identities and has not independently confirmed that the user Menhaz is the same person charged in the deaths.
The messages sent on Sunday were accompanied by several extremely graphic photos of dead bodies and bloody weapons. CBC News has not been able to confirm they are from the Markham crime scene.
One of the people who provided the messages to CBC News is an administrator of a private server for the game. He provided the screen grabs of the March messages, which he said were sent to another user.
The administrator said in an email that no one on the server initially saw the user as a threat.
“People would see this as a joke (although I’m sure that several players would have asked for details, even if it was a joke),” he wrote.
“[We] will surely be more strict towards similar acts out of players, as in we will be censoring more words, specifically words that suggest murder or harm, and will approach players who seem actually depressed,” the administrator wrote.
‘We’re not mental health professionals’
Just who bears the responsibility to flag issues like these on the internet is complicated, said Richard Lachman, associate professor at the RTA School of Media at Ryerson University.
“We’re not mental health professionals,” Lachman said about people in the gaming community.
“If one is a well-meaning person and someone on social media is putting out what you might interpret as a cry for help, the best we can do is try and link that person with professional resources.”
The onus is also on the companies that run these services to adequately monitor their platforms, Lachman said, but that is a complicated proposition as well.
“It’s not easy to say we want companies to deal with this. It might be incredibly expensive to do. Companies want to move to an algorithmic way, to try to do this in an automated way — maybe by scanning for words, or trying to identify things in video posts.”
Discord said in an emailed statement that the company is “shocked and appalled by this tragic event,” and “working closely with law enforcement to provide any assistance we can.”
The statement also pointed to Discord's community guidelines, which all users must follow and which prohibit threatening messages or any illegal activity.
“We investigate and take immediate action against any reported violation by a server or user, which can include shutting down offending servers or banning users,” the company said.
Lachman said if society wants a wholesale change of online speech, it will likely require more human eyes alongside an “algorithmic solution.”
“And if we’re not willing to implement those solutions, maybe we have to do without those services,” he said.
York Regional Police would not answer questions about how the police service receives tips from online platforms.
“This is an ongoing investigation as well as a case before the courts,” said Const. Andy Pattenden in an email. “Our only anticipated update on this case is the release of the names of the victims and the cause of death once these have both been determined by the Coroner.”