The centre's Digital Terrorism and Hate Project gave Twitter a grade of "B" in a report card of social networking companies' efforts to fight online activity by militant groups such as IS.
"We think they are definitely heading in the right direction," the project's director, Rabbi Abraham Cooper, told Reuters in a telephone interview ahead of Monday's release of the report card at a press conference in New York.
He said the review was based on steps that Twitter has already taken and information that centre staff learned in face-to-face meetings with company representatives.
Islamic State has long relied on Twitter to recruit and radicalize new adherents. The Wiesenthal Center, an international Jewish human rights organization, has been one of the toughest critics of Twitter's strategy for combating those efforts.
Some vocal Twitter critics have tempered their views since December, when the site revised its community policing policies, clearly stating that it banned "hateful conduct" that promotes violence against specific groups and would delete offending accounts.
Researchers with George Washington University's Program on Extremism last month reported that Islamic State's English-language reach on Twitter stalled last year amid a stepped-up crackdown by the company against the extremist group's army of digital proselytizers.
Last year, the centre gave Twitter a grade of "C" in a report card that covered efforts to fight terrorism along with hate speech. This year it gave two grades, awarding Twitter a "D" on hate speech, saying the company needed to do more to censor the accounts of groups that promote hate.
A Twitter spokesman declined to comment but pointed to a statement posted on the company's blog on Feb. 5 about combating violent extremism.
"We condemn the use of Twitter to promote terrorism and the Twitter Rules make it clear that this type of behaviour, or any violent threat, is not permitted on our service," Twitter said in the blog.
Among other major Internet firms included in this year's survey, Facebook Inc got an "A-" for terrorism and a "B-" for hate.
Cooper said Facebook "understood" the gravity of the issue before most companies, set up a team of monitors worldwide to catch the posts in question and created technological fixes to prevent extremists from creating new accounts.
Facebook did not immediately respond to a request for comment.
Meanwhile, Alphabet Inc's YouTube got a "B-" for terrorism and a "D" for hate due to what Cooper described as a "reactive" response to videos posted rather than a proactive approach to keeping them off the site.
He pointed to a video published last year on YouTube by the Al Shabaab fundamentalist group that listed Mall of America in Minnesota as a potential attack site. He said that video was pulled from YouTube after several hours.
YouTube declined to comment.
© Thomson Reuters 2016