Serve same website under two URLs / domains with certbot

forumacct nginx-forum at forum.nginx.org
Sat Jun 5 08:28:06 UTC 2021


Hello All,

Using nginx/1.14.2 on Linux rpi3 5.10.17-v7+. Historically I ended up with
two domains. I started with a dyndns 'domain' (is that actually the correct
word for it?) to operate my homemade RPI weather station. Later I got a
domain from 'hover.com' to run my astronomy hobby webpage. Both pages are
on the same RPI in the same directory branch, and the domain names are
actually interchangeable. You can see my weather station at:

http://drgert.dyndns.ws:8000/rpi/rpi.html
http://www.skywatcher.space/rpi/rpi_wetter/rpi_wetter.php

So far I had one default config:
ls -l /etc/nginx/sites-enabled
lrwxrwxrwx 1 root root 26 Jun  4 20:12 default ->
../sites-available/default

Content:
server {
    listen 80 default_server;
    listen 8000; # Alternate http port
    root /media/usbstick/nginx/www;
    # Add index.php to the list if you are using PHP
    index index.php index.html index.htm;
    server_name localhost;

    location / {
        try_files $uri $uri/ =404;
    }

    # pass PHP scripts to FastCGI server
    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        fastcgi_pass unix:/run/php/php7.3-fpm.sock;
    }
}

Now I want to use certbot for https.
But that requires a certificate for each domain (I think).
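
From reading the certbot documentation, I believe a single certificate can
also cover several names when multiple -d options are given. My guess
(untested, just a sketch reusing my two domain names) is that the commands
would look roughly like:

certbot --nginx -d skywatcher.space -d www.skywatcher.space
certbot --nginx -d drgert.dyndns.ws -d www.drgert.dyndns.ws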

So I tried deleting the default file and creating two conf files in conf.d,
one for each domain.

vi /etc/nginx/conf.d/www.skywatcher.space.conf 
server {
    listen 80 default_server;
    listen [::]:80 default_server;
    root /media/usbstick/nginx/www;
    server_name skywatcher.space www.skywatcher.space;
}

vi /etc/nginx/conf.d/www.drgert.dyndns.ws.conf
server {
        listen 80 default_server;
        listen 8000; # Alternate http port
        root /media/usbstick/nginx/www;
        # Add index.php to the list if you are using PHP
        index index.php index.html index.htm;
        server_name drgert.dyndns.ws www.drgert.dyndns.ws;
        location / {
                try_files $uri $uri/ =404;
        }
        # pass PHP scripts to FastCGI server
        location ~ \.php$ {
                include snippets/fastcgi-php.conf;
                fastcgi_pass unix:/run/php/php7.3-fpm.sock;
        }
}

But then each of these files had the 'default_server' keyword in it and it
failed to work.
Also, I don't know if it's OK that both try to listen on port 80.
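
My guess at a fix: only one server block per listen address/port can be the
default_server, and nginx otherwise picks the block whose server_name matches
the Host header, so it should be fine that both listen on 80. A sketch of how
I imagine the two files would look (untested; root, index and the PHP socket
are just copied from my old default file):

# /etc/nginx/conf.d/www.skywatcher.space.conf
server {
    listen 80;        # no default_server here
    listen [::]:80;
    server_name skywatcher.space www.skywatcher.space;
    root /media/usbstick/nginx/www;
    index index.php index.html index.htm;

    location / {
        try_files $uri $uri/ =404;
    }

    # pass PHP scripts to FastCGI server
    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        fastcgi_pass unix:/run/php/php7.3-fpm.sock;
    }
}

# /etc/nginx/conf.d/www.drgert.dyndns.ws.conf
server {
    listen 80 default_server;   # only this block keeps default_server
    listen 8000;                # alternate http port
    server_name drgert.dyndns.ws www.drgert.dyndns.ws;
    root /media/usbstick/nginx/www;
    index index.php index.html index.htm;

    location / {
        try_files $uri $uri/ =404;
    }

    # pass PHP scripts to FastCGI server
    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        fastcgi_pass unix:/run/php/php7.3-fpm.sock;
    }
}

But I am not sure whether this is the right approach at all.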

So how do I do this right?

Both domains should point to the same html root, and both should receive
https certificates. :-)

Cheers,
Gert

Posted at Nginx Forum: https://forum.nginx.org/read.php?2,291774,291774#msg-291774


