HDR (High Dynamic Range) means a wider range of brightness and colour on your screen, so instead of seeing one uniform grey, for example, you would see many levels of luminosity of grey, leading to a more “realistic” image.
Enabling it for each port just means it isn’t a system-wide setting: each HDMI input port must be individually enabled. It is intensive on resources, but modern TVs should cope with it.
Enhanced format mostly relates to 4K (also known as UHD) signals. Because these carry a lot more data, the port that handles an Enhanced format input must be turned on, e.g. a PS4 displaying 4K on a TV; this allows the port to transfer a much larger amount of data (up to 18 gigabits per second, shown as 18Gbps, on an HDMI 2.0 port). It should only be set to on when using a 4K-capable stream or device.
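To give a feel for where that 18Gbps figure comes from, here is a rough back-of-envelope sketch. The numbers are standard published figures, not anything from the TV manual above: a 4K 60Hz signal uses a 594 MHz pixel clock (including blanking), sent over 3 HDMI data channels at 10 bits per pixel each (8 colour bits plus 2 encoding bits).

```python
# Rough sketch of why a 4K 60Hz signal needs an ~18 Gbps HDMI 2.0 link.
# Assumed standard figures: 594 MHz pixel clock, 3 TMDS data channels,
# 10 bits sent per channel per pixel (8 colour bits + 2 encoding bits).
pixel_clock_hz = 594_000_000
tmds_channels = 3
bits_per_channel_per_pixel = 10

link_rate_gbps = pixel_clock_hz * tmds_channels * bits_per_channel_per_pixel / 1e9
print(f"4K 60Hz link rate: {link_rate_gbps:.2f} Gbps")  # ~17.82 Gbps
```

That works out to about 17.82 Gbps, which is why the port needs the full 18Gbps “Enhanced” mode switched on, and why a standard-bandwidth port can’t carry it.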
HDMI port 2 or 3 on the input section of the TV means the ports numbered 2 or 3. If the TV has, say, 4 HDMI ports, they will be numbered 1 to 4. Check the manual (the ports themselves are often labelled too) to find out which number each one is. In the case you have noted, the only ports that support 4K/UHD/HDR input are ports 2 and 3, so if you want to use 4K/UHD/HDR you can only plug the 4K/UHD/HDR device into port 2 or 3, and that port must then be enabled to handle that data stream. If you plug into HDMI port 1 or 4, as per my example of a 4-port TV, you will never see the 4K/UHD/HDR output, as those ports are not designed to carry the higher bandwidth that 4K/UHD/HDR needs.
Artefacts are faults in an image. As an example, you have a picture/photo that looks pretty good from a distance, but as you zoom in you get big squares/dots that are not quite right, making the edges of objects look blurry (because they are). These errors are what are called artefacts, particularly when they are obvious, e.g. sudden squares appearing onscreen. They are caused when the image is processed and errors are created.

You may hear gamers talking about artefacts or tearing when they are playing games, as the image processor (graphics display hardware/engine) tries to keep displaying as many frames (images) per second as it can. This means it doesn’t always produce a clean, crisp image, and the result is artefacts or image errors. Tearing is when the screen updates partway through a frame, so the top and bottom of the picture come from two different frames and the image looks torn, like a ripped piece of paper. Blurring or ghosting is different: the graphics processor and the pixel refresh rate can’t keep up with the data, so you get a hazy image where you can still see a faint “ghost” of the previous frames on the screen.

The screen must also be capable of switching the individual pixels that make up an image quickly; if a pixel stays lit too long before it has to change again, you get the blurring/ghosting. Gamers prefer below 5ms (5 milliseconds) for a pixel to go GtG (Grey to Grey). Grey to Grey is the standard way this is measured: the time it takes a pixel to change from one shade of grey to another. At 5ms or faster we don’t really perceive the change as blurring, as our eyes refresh at a rate of about 10ms I think (I could be wrong on that eye refresh speed). So the frame rate (number of frames per second) and the response speed of the pixels determine how much blurring and how many artefacts you see.
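A quick bit of arithmetic shows how that 5ms figure relates to frame rate. This is just a sketch of the reasoning above: if a pixel’s GtG transition takes longer than one frame lasts on screen, the pixel is still mid-change when the next frame arrives, which we see as blur/ghosting. (The 5ms target is the gamer figure mentioned above; the refresh rates are just common examples.)

```python
# Sketch: compare pixel response time (GtG) to how long one frame
# lasts at various refresh rates. If GtG is slower than the frame
# time, pixels are still changing when the next frame arrives.
gtg_ms = 5.0  # the "gamer" target response time mentioned above

for refresh_hz in (60, 120, 144, 240):
    frame_time_ms = 1000 / refresh_hz  # how long each frame is shown
    keeps_up = gtg_ms <= frame_time_ms
    print(f"{refresh_hz} Hz: frame lasts {frame_time_ms:.2f} ms -> "
          f"{'pixels keep up' if keeps_up else 'pixels lag (blur/ghosting)'}")
```

At 60Hz a frame lasts about 16.7ms, so a 5ms pixel has plenty of time; at 240Hz a frame lasts only about 4.2ms, so even a 5ms pixel starts to lag, which is why very high refresh gaming monitors chase ever faster GtG times.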
The Sport 576 motion artefacts option may be a way the image is processed to help reduce this tearing and blurring, but I have no idea, sorry. There is some error correction processors can do to improve the image and stop these artefacts being noticed. It is often used to help improve sports motion capture, MRI image capture and similar tasks where artefacts would impact the result.