Abstract: The founders of social media platforms built them in the hope that users would employ them for commerce and entrepreneurship, for enriching social interaction, and for giving different parties an opportunity to talk. At the same time, these platforms can host discussions that are harmful to governments and users alike, such as terrorism and Islamic fundamentalism, homosexuality, and abuse and violence against women. Although platforms are essentially intermediaries between users and are not legally responsible for republished content, they should nevertheless reconsider their responsibilities through self-regulation; restriction, deletion, filtering, and flagging are examples of such measures. Conversely, when restricting users, platforms should weigh freedom of speech, exercise moderation, and leave the task of editing content to neutral parties, while users, for their part, should not create trouble for the owners of social networks.
Keywords: platforms, social media, social networks, platform governance, users
Archetti, C. (2015). Terrorism, communication and new media: Explaining radicalization in the digital age. Perspectives on Terrorism, 9(1).
Ardia, D. S. (2010). Free speech savior or shield for scoundrels: An empirical study of intermediary immunity under Section 230 of the Communications Decency Act. Loyola of Los Angeles Law Review, 43(2), 373–506.
Armijo, E. (2013). Kill switches, forum doctrine, and the First Amendment’s digital future. Cardozo Arts & Entertainment Law Journal, 32, 411–469.
Balkin, J. (2004). Digital speech and democratic culture: A theory of freedom of expression for the information society. New York University Law Review, 79, 1–55.
Balkin, J. M. (2014). Old school/new school speech regulation. Harvard Law Review, 127(8), 2296–2342.
Baym, N. K., & boyd, d. (2012). Socially mediated publicness: An introduction. Journal of Broadcasting & Electronic Media, 56(3), 320–329.
boyd, d. (2011). Social network sites as networked publics: Affordances, dynamics, and implications. In Z. Papacharissi (Ed.), A networked self: Identity, community, and culture on social network sites (pp. 39–58). New York: Routledge.
Bruns, A., & Burgess, J. (2015). Twitter hashtags from ad hoc to calculated publics. In N. Rambukkana (Ed.), Hashtag publics: The power and politics of discursive networks (pp. 13–28). New York: Peter Lang.
Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180.
Citron, D. K. (2014). Hate crimes in cyberspace. Cambridge, MA: Harvard University Press.
Couldry, N., & van Dijck, J. (2015). Researching social media as if the social mattered. Social Media + Society, 1(2), 2056305115604174.
Crawford, K., & Gillespie, T. (2016). What is a flag for? Social media reporting tools and the vocabulary of complaint. New Media & Society, 18(3), 410–428.
DeNardis, L., & Hackl, A. M. (2015). Internet governance by social media platforms. Telecommunications Policy, 39(9), 761–770.
Forsyth, H. (2016). Forum. In B. Peters (Ed.), Digital keywords. Princeton, NJ: Princeton University Press.
Gerlitz, C., & Helmond, A. (2013). The like economy: Social buttons and the data-intensive web. New Media & Society, 15(8), 1348–1365.
Gillespie, T. (2015). Platforms intervene. Social Media + Society, 1(1), 2056305115580479.
Ginsburg, J. C. (1995). Putting cars on the ‘information superhighway’: Authors, exploiters, and copyright in cyberspace. Columbia Law Review, 95(6), 1466–1499.
Godwin, M. (2003). Cyber rights: Defending free speech in the digital age. Cambridge, MA: MIT Press.
Grimmelmann, J. (2015). The virtues of moderation: Online communities as semicommons. Yale Journal of Law and Technology, 17(42).
Humphreys, S. (2013). Predicting, securing and shaping the future: Mechanisms of governance in online social environments. International Journal of Media & Cultural Politics, 9(3), 247–258.
Kreimer, S. F. (2006). Censorship by proxy: The First Amendment, Internet intermediaries, and the problem of the weakest link. University of Pennsylvania Law Review, 155(1), 11.
Langlois, G. (2013). Participatory culture and the new governance of communication: The paradox of participatory media. Television & New Media, 14(2), 91–105.
Lessig, L. (1999). Code and other laws of cyberspace. New York: Basic Books.
Litman, J. (1999). Electronic commerce and free speech. Ethics and Information Technology, 1(3), 213–225.
MacKinnon, R. (2012). Consent of the networked: The worldwide struggle for Internet freedom. New York: Basic Books.
MacKinnon, R., Hickok, E., Bar, A., & Lim, H. (2014). Fostering freedom online: The roles, challenges and obstacles of Internet intermediaries. New York: United Nations Educational, Scientific and Cultural Organization (UNESCO).
Mann, R. J., & Belzley, S. R. (2005). The promise of Internet intermediary liability. William & Mary Law Review, 47, 239–308.
Matias, J. N., Johnson, A., Boesel, W. E., Keegan, B., Friedman, J., & DeTar, C. (2015). Reporting, reviewing, and responding to harassment on Twitter. Women, Action & the Media. Retrieved from womenactionmedia.org/twitter-report/
Meyerson, M. (2001). The neglected history of the prior restraint doctrine: Rediscovering the link between the First Amendment and the separation of powers. Indiana Law Review, 34(2), 295–342.
Milan, S. (2015). When algorithms shape collective action: Social media and the dynamics of cloud protesting. Social Media + Society, 1(2), 2056305115622481.
Mueller, M. L. (2015). Hyper-transparency and social control: Social media as magnets for regulation. Telecommunications Policy, 39(9), 804–810.
Obar, J. A., & Wildman, S. (2015). Social media definition and the governance challenge: An introduction to the special issue. Telecommunications Policy, 39(9), 745–750.
Postigo, H. (2009). America Online volunteers: Lessons from an early co-production com-munity. International Journal of Cultural Studies, 12(5), 451–469.
Reagle, J. (2015). Reading the comments: Likers, haters, and manipulators at the bottom of the web. Cambridge, MA: MIT Press.
Roberts, S. T. (2016). Commercial content moderation: Digital laborers’ dirty work. In S. U. Noble & B. Tynes (Eds.), Intersectional Internet: Race, sex, class and culture online. New York: Peter Lang.
Roth, Y. (2015). ‘No overly suggestive photos of any kind’: Content management and the policing of self in gay digital communities. Communication, Culture & Critique, 8(3), 414–432.
Sandvig, C. (2015). The social industry. Social Media + Society, 1(1), 2056305115582047.
Shepherd, T., & Landry, N. (2013). Technology design and power: Freedom and control in communication networks. International Journal of Media & Cultural Politics, 9(3), 259–275.
Shirky, C. (2008). Here comes everybody: How change happens when people come together. New York: Penguin Press.
Stein, L. (2013). Policy and participation on social media: The cases of YouTube, Facebook, and Wikipedia. Communication, Culture & Critique, 6(3), 353–371.
Thompson, J. B. (2005). The new visibility. Theory, Culture & Society, 22(6), 31–51.
Vaidhyanathan, S. (2011). The Googlization of everything (and why we should worry). Berkeley, CA: University of California Press.
van Dijck, J. (2013). The culture of connectivity: A critical history of social media. Oxford: Oxford University Press.
Varnelis, K. (Ed.). (2008). Networked publics. Cambridge, MA: MIT Press.
Wagner, B. (2013). Governing Internet expression: How public and private regulation shape expression governance. Journal of Information Technology & Politics, 10(4), 389–403.
Weltevrede, E., Helmond, A., & Gerlitz, C. (2014). The politics of real-time: A device perspective on social media platforms and search engines. Theory, Culture & Society, 31(6), 125–150.
Mohammad Hosaini, E. (2021). Regulation of and by Platforms (The challenge of cyberspace legislation; who is responsible for publishing inappropriate content). Society Culture Media, 10(39), 245–260.