I am aware that any data coming into a server from a client should be handled safely (sanitized, whitelisted, etc.), but I had slapped together a very simple system that pulls file contents based on a file name taken from the query string, entirely on the client side in JavaScript, so the processing of the query string actually happens client side rather than server side.
Essentially what I have at the moment is this:
function loadPage() {
    // Read the requested page name from the query string, defaulting to 'home'
    var urlParams = new URLSearchParams(window.location.search);
    var page = urlParams.get('page');
    if (page == '' || page == null) {
        page = 'home';
    }
    // Build the path to the page script and load it if it exists
    var jFile = 'assets/pages/' + encodeURIComponent(page) + '.js';
    if (fileExists(jFile)) {
        $.getScript(jFile, function loadReturn(data) {
            loadComplete();
        });
    } else {
        console.log('404');
    }
}
function fileExists(file) {
    // Synchronous HEAD request to check whether the file is reachable
    var xhr = new XMLHttpRequest();
    xhr.open('HEAD', file, false);
    xhr.send();
    if (xhr.status === 404) {
        return false;
    } else {
        return true;
    }
}
I understand the concern/risk on the server side with things like PHP, especially when dealing with databases, so not sanitizing there seems like a bad idea. On the other hand, I feel like if this were a security risk, anyone could just make the same call anyway from their JavaScript console to whatever file they wanted to try and exploit.
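For example, nothing stops someone from opening the dev tools console and requesting any script path directly themselves (the path below is just an illustration):

// Illustrative only: a visitor can already request an arbitrary script path
// from the console, without going through my query-string handling at all
$.getScript('assets/pages/some-other-page.js');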
As for the why: my goal was to make an extremely lightweight CMS of sorts. If this is just a terrible idea security-wise, I can build out a more complex system, but in that case I would be interested in how/why it is a bad idea, as well as any possible client-side solutions. My preference would be to not have any server-side scripting for this.
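For what it's worth, one client-side mitigation I could add is a hard-coded whitelist of page names (the names below are just placeholders), though I don't know whether that is actually necessary here:

// Hypothetical whitelist of known pages; fall back to 'home' for anything else
var allowedPages = ['home', 'about', 'contact'];
if (allowedPages.indexOf(page) === -1) {
    page = 'home';
}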