User Requirements for the Social Networking Solution
Security is of paramount importance for social websites. The databases of sites like Facebook and Twitter contain the personal data of their users, so data protection and privacy must be a top priority for any social networking website.
I would recommend five methods for website protection:
- HTTPS: This protocol assures users that the website they are visiting is authentic and that no other party can observe the traffic between them and the web server (Durumeric et al., 2017). Verbania Inc's website will hold user accounts and users will be sharing private data with it, so serving the site over HTTPS both protects that traffic and establishes the company's authenticity. The certificate can be obtained free of cost from Let's Encrypt, a free, automated certificate authority. A minimal sketch of enforcing HTTPS at the application layer appears after this list.
- Uploaded files: Users of a social website upload media files such as pictures. An image file has enough room to smuggle in PHP code that, if executed on the web server, could open a back door and steal data (Olsson, 2016). I therefore strongly recommend that user data not be stored on the web server itself; it should be kept on an entirely separate server, so that even a successful hack of the web server does not leak user data. Uploaded files can first be held in a quarantine area on the web server and moved to the database server only after rigorous and thorough scanning for script code. Script code is hard to detect by conventional means, so files should stay away from the database server until the security team has cleared them. A simple quarantine-and-scan sketch appears after this list.
- DDoS Attacks: A Distributed Denial of Service (DDoS) attack floods the web server with fake traffic to disrupt the website's services. Verbania Inc will be hosting online multiplayer games, so such attacks are likely, including from attackers who simply want to crash the website for fun. I recommend routing traffic through redirection and scrubbing services, which reduce the rate of direct traffic to the server and thus the risk of a DDoS attack (Wang et al., 2015). Servers can also be overloaded by legitimate users continuously spamming game requests; this can be minimized with a cooldown timer between successful game sessions, which reduces the request rate and also promotes healthier play by making gamers take breaks (Chung et al., 2016). A per-user cooldown sketch appears after this list.
- Account Passwords: Passwords for each online account must not be guessable by traditional means. During account creation, users should be required to choose passwords containing both letters and numbers, which reduces the chance of successful dictionary and brute-force attacks; such attacks already demand large amounts of computation and time, and highly complex passwords increase the cracking time exponentially. Passwords stored on the database server must never be kept in plain text. My recommendation for Verbania Inc is to hash them with a Secure Hash Algorithm (SHA) together with a salt (Guesmi et al., 2016). Hashing passwords before storing them protects users even if the password database is stolen, because a cryptographic hash cannot feasibly be reversed. Adding a unique random salt to each password before hashing (Asimi et al., 2016) ensures that identical passwords do not produce identical hashes and defeats precomputed rainbow-table attacks. A salted-hashing sketch appears after this list.
- Website Security Testers: Before the website goes onto live servers, it should be tested for security flaws through rigorous penetration testing in a beta environment. Several tools can help. Netsparker can test the server for SQL injection and Cross-Site Scripting (XSS) vulnerabilities (Appelt et al., 2014). These attacks are carried out through script code that is easy to inject and hard to detect, and they are mainly used to steal data from the server, so even a single flaw can be enough for a successful attack. The Xenotix XSS Exploit Framework can also be used to simulate XSS attacks on the beta server (Gupta & Gupta, 2017); this open-source tool contains a large collection of XSS payloads that let us reproduce such attacks and fix the flaws before going live. A customizable, automated scanner such as OpenVAS, which is free and open-source, can then test for vulnerabilities on a scheduled basis. A standard application-level defense against SQL injection is the parameterized query; a sketch appears after this list.
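The following sketch relates to the HTTPS recommendation above. It is a minimal example, assuming a Flask application and that TLS termination (for example with a Let's Encrypt certificate) is already configured on the web server or load balancer; it simply redirects plain-HTTP requests and asks browsers to stick to HTTPS.

```python
# Minimal HTTPS enforcement sketch (assumes Flask; TLS itself is terminated upstream).
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def force_https():
    # Redirect any plain-HTTP request to its HTTPS equivalent.
    if not request.is_secure:
        return redirect(request.url.replace("http://", "https://", 1), code=301)

@app.after_request
def add_hsts(response):
    # Ask browsers to use HTTPS only for the next year.
    response.headers["Strict-Transport-Security"] = "max-age=31536000; includeSubDomains"
    return response
```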
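For the uploaded-files recommendation, the sketch below quarantines an upload and only promotes it to separate storage after a simple scan for embedded PHP markers. The marker list, paths, and function names are illustrative assumptions, not a complete scanner; a production system would combine this with dedicated malware scanning.

```python
# Hypothetical quarantine-and-scan step for user uploads.
import shutil
from pathlib import Path

SUSPICIOUS_MARKERS = (b"<?php", b"<?=", b"eval(", b"base64_decode(")

def looks_malicious(path: Path) -> bool:
    # Naive check: reject files containing common PHP/script markers.
    data = path.read_bytes()
    return any(marker in data for marker in SUSPICIOUS_MARKERS)

def promote_upload(quarantine_file: Path, storage_dir: Path) -> bool:
    """Move a quarantined upload to the storage server's mount only if it passes the scan."""
    if looks_malicious(quarantine_file):
        quarantine_file.unlink()  # reject and delete the suspicious upload
        return False
    storage_dir.mkdir(parents=True, exist_ok=True)
    shutil.move(str(quarantine_file), storage_dir / quarantine_file.name)
    return True
```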
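The per-user cooldown mentioned in the DDoS recommendation could look like the sketch below. The 60-second cooldown and the in-memory dictionary are assumptions for illustration; a real deployment would keep this state in a shared store and still rely on upstream traffic scrubbing.

```python
# Simple per-user cooldown between game sessions (illustrative only).
import time

COOLDOWN_SECONDS = 60  # assumed break between game sessions
last_session_end: dict[str, float] = {}

def can_start_game(user_id: str) -> bool:
    """Allow a new game only if the user's cooldown has expired."""
    last = last_session_end.get(user_id)
    return last is None or (time.monotonic() - last) >= COOLDOWN_SECONDS

def end_game(user_id: str) -> None:
    # Record when the user's last session finished.
    last_session_end[user_id] = time.monotonic()
```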
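The password recommendation can be sketched with Python's standard library: a unique random salt per password and a SHA-256-based key-derivation step. The iteration count and storage format are illustrative choices, not a prescription.

```python
# Salted password hashing sketch using PBKDF2 over SHA-256 (standard library only).
import hashlib
import hmac
import secrets

ITERATIONS = 200_000  # illustrative work factor

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest) for storage; a unique random salt is generated per password."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, stored_digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(candidate, stored_digest)
```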
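Finally, for the SQL injection concern raised under security testing, the sketch below shows a parameterized query. SQLite is used only to keep the example self-contained; the same placeholder-binding idea applies to MySQL drivers with their own parameter markers.

```python
# Parameterized query sketch: user input is bound as a value, never spliced into SQL.
import sqlite3

def find_user(conn: sqlite3.Connection, username: str):
    cur = conn.execute("SELECT id, username FROM users WHERE username = ?", (username,))
    return cur.fetchone()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, username TEXT)")
conn.execute("INSERT INTO users (username) VALUES (?)", ("alice",))
print(find_user(conn, "alice' OR '1'='1"))  # None: the injection attempt finds nothing
```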
Apart from website security, there will be times when the server faces downtime for maintenance or is recovering from data loss after an attempted hack. I recommend the following steps to recover from any disaster, whether man-made or natural:
- Backup: A website hack can corrupt or even delete critical files from the server. The risk of data loss can be minimized with regular, automated backups; my recommendation is daily backups of both the website and the database. A simple backup sketch appears after this list.
- Code 503 Status Page: The web server or database server may face downtime for any number of reasons, such as an attempted hack, maintenance, or a server upgrade. During such periods the website should return HTTP status code 503, which tells visitors that the site is temporarily down and that regular service will resume after a specified time. It also tells Google that the outage is temporary; I recommend this because Google might otherwise remove the URL from its index during downtime. This method only helps for short outages, however: if the site keeps returning 503 for a long period, Google's algorithm will eventually treat it as permanently unavailable. A minimal maintenance-page sketch appears after this list.
- Mapping of old URLs to new URLs: The website's URLs may change once downtime is resolved, since downtime is often fixed through a system upgrade or changes to the database. My recommendation here is to use 301 (permanent) redirects to map the old URLs to the new ones so that users and search engines are not confused (Donovan & Feamster, 2014). A redirect-mapping sketch appears after this list.
- Redundant Servers: Natural disasters can also cause server downtime. The facility hosting Verbania Inc's web server and database server must be backed by redundant servers in different geographic locations to minimize losses from natural disasters (Colman-Meixner et al., 2014); it is highly unlikely that all of the servers would be down at the same time. A simple health-check failover sketch appears after this list.
- Google Cloud Platform: Recovery cost depends on both the time spent and the amount of data to be recovered, and both can be reduced drastically with the Google Cloud Platform (Greenly et al., 2015). Application and storage complexity are also reduced. Data backed up to a cloud server can be recovered at any time and from any place over Google's global network, and because the data is replicated across multiple locations, the risk of data loss is very low. The data security offered by Google is also highly reliable, and backing up to the cloud reduces hardware and software costs. Most organizations, especially social website operators whose servers are attacked regularly, use cloud services. In my view, Google Cloud SQL can be used to host a managed MySQL database (Krishnan & Gonzalez, 2015); it combines the benefits of MySQL with those of the Google Cloud Platform, and the security and reliability of the two together mean that the contents of the database can be recovered at any time in the event of a website crash. A sketch of connecting to such an instance appears after this list.
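For the backup recommendation, the sketch below archives the website directory and a directory of database dumps into a timestamped tarball on a backup mount. The paths are placeholders; in practice the script would be scheduled daily (for example via cron) and the archives shipped off-site.

```python
# Daily backup sketch: archive web files and database dumps into a dated tarball.
import tarfile
from datetime import date
from pathlib import Path

SOURCES = [Path("/var/www/verbania"), Path("/var/backups/db_dumps")]  # placeholder paths
DEST = Path("/mnt/backup_server")                                     # placeholder mount

def daily_backup() -> Path:
    DEST.mkdir(parents=True, exist_ok=True)
    archive = DEST / f"verbania-backup-{date.today():%Y%m%d}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        for src in SOURCES:
            if src.exists():
                tar.add(src, arcname=src.name)  # store each source under its own name
    return archive

if __name__ == "__main__":
    print(f"Backup written to {daily_backup()}")
```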
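The 503 status page recommendation can be illustrated with a tiny maintenance server that returns the status code together with a Retry-After header, signalling to visitors and crawlers that the outage is temporary. The one-hour retry hint and port are arbitrary example values.

```python
# Maintenance-mode sketch: every request gets HTTP 503 plus a Retry-After hint.
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<h1>Down for maintenance</h1><p>Please check back shortly.</p>"
        self.send_response(503)
        self.send_header("Retry-After", "3600")  # suggest retrying in one hour
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), MaintenanceHandler).serve_forever()
```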
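The 301 redirect mapping could be handled at the application layer as sketched below, again assuming Flask; the mapping table and paths are hypothetical examples, and routes for the new URLs would be registered elsewhere in the application.

```python
# 301 redirect sketch mapping old URL paths to their new locations.
from flask import Flask, redirect

app = Flask(__name__)

OLD_TO_NEW = {
    "/profile.php": "/profile",   # example legacy path -> new path
    "/games/play.php": "/games",
}

@app.route("/<path:old_path>")
def legacy_redirect(old_path):
    new_path = OLD_TO_NEW.get("/" + old_path)
    if new_path:
        return redirect(new_path, code=301)  # permanent redirect preserves search ranking
    return ("Not found", 404)
```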
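For redundant servers, one piece of the puzzle is a monitor that checks the primary site and nominates a geographically separate standby when the primary stops responding. The health-check URLs below are placeholders, and the actual traffic switch (for example a DNS update) is left out of the sketch.

```python
# Failover sketch: choose which endpoint should receive traffic based on a health check.
import urllib.error
import urllib.request

PRIMARY = "https://www.verbania.example/healthz"  # placeholder primary endpoint
STANDBY = "https://dr.verbania.example/healthz"   # placeholder standby in another region

def is_healthy(url: str, timeout: float = 5.0) -> bool:
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, TimeoutError):
        return False

def choose_active_endpoint() -> str:
    """Return the endpoint traffic should be pointed at (e.g. via a DNS update)."""
    return PRIMARY if is_healthy(PRIMARY) else STANDBY
```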
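Finally, a Cloud SQL instance that speaks the MySQL protocol can be reached with an ordinary MySQL client. The sketch below assumes the PyMySQL driver is installed; the host address, credentials, database name, and CA certificate path are placeholders.

```python
# Sketch of connecting to a MySQL-compatible Cloud SQL instance over TLS.
import pymysql

conn = pymysql.connect(
    host="203.0.113.10",        # example instance address (placeholder)
    user="verbania_app",        # placeholder credentials
    password="change-me",
    database="verbania",
    ssl={"ca": "/etc/ssl/certs/server-ca.pem"},  # enforce TLS to the instance
)
with conn.cursor() as cur:
    cur.execute("SELECT VERSION()")
    print(cur.fetchone())
conn.close()
```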
I am offering these recommendations to the security specialist on our team. Securing a social website like Verbania Inc's is very complex, and security experts have to implement multi-layered security, because an attacker needs only one entry point to turn an attempted hack into a successful one. To conclude, I am familiar with the IT policies on security and disaster recovery, and all of the recommendations in this document strictly abide by those policies.
Reference List
Appelt, D., Nguyen, C. D., Briand, L. C., & Alshahwan, N. (2014, July). Automated testing for SQL injection vulnerabilities: an input mutation approach. In Proceedings of the 2014 International Symposium on Software Testing and Analysis (pp. 259-269). ACM.
Asimi, Y., Amghar, A., Asimi, A., & Sadqi, Y. (2016). New random generator of a safe cryptographic salt per session. International Journal of Network Security, 18(3), 445-453.
Chung, U. S., Han, D. H., Shin, Y. J., & Renshaw, P. F. (2016). A prosocial online game for social cognition training in adolescents with high-functioning autism: an fMRI study. Neuropsychiatric Disease and Treatment, 12, 651.
Colman-Meixner, C., Dikbiyik, F., Habib, M. F., Tornatore, M., Chuah, C. N., & Mukherjee, B. (2014). Disaster-survivable cloud-network mapping. Photonic Network Communications, 27(3), 141-153.
Donovan, S., & Feamster, N. (2014, August). NetAssay: Providing new monitoring primitives for network operators. In ACM SIGCOMM Computer Communication Review (Vol. 44, No. 4, pp. 345-346). ACM.
Durumeric, Z., Ma, Z., Springall, D., Barnes, R., Sullivan, N., Bursztein, E., … & Paxson, V. (2017). The security impact of HTTPS interception. In Network and Distributed Systems Symposium (NDSS 2017).
Greenly, D., Duncan, M., Wysack, J., & Flores, F. (2015). Space Situational Awareness Data Processing Scalability Utilizing Google Cloud Services. In Advanced Maui Optical and Space Surveillance Technologies Conference.
Guesmi, R., Farah, M. A. B., Kachouri, A., & Samet, M. (2016). A novel chaos-based image encryption using DNA sequence operation and Secure Hash Algorithm SHA-2. Nonlinear Dynamics, 83(3), 1123-1136.
Gupta, S., & Gupta, B. B. (2017). Cross-Site Scripting (XSS) attacks and defense mechanisms: classification and state-of-the-art. International Journal of System Assurance Engineering and Management, 8(1), 512-530.
Krishnan, S. P. T., & Gonzalez, J. L. U. (2015). Google Cloud SQL. In Building Your Next Big Thing with Google Cloud Platform (pp. 159-183). Apress.
Olsson, M. (2016). Using PHP. In PHP 7 Quick Scripting Reference (pp. 1-4). Apress.
Wang, B., Zheng, Y., Lou, W., & Hou, Y. T. (2015). DDoS attack protection in the era of cloud computing and software-defined networking. Computer Networks, 81, 308-319.