New features include the ability to change the youngster's settings remotely to block them from carrying out searches, and to prevent strangers from seeing their posts.
Children can still override these limitations but not without their parents being told.
The action comes a fortnight after BBC Panorama raised safety concerns.
The documentary highlighted how predators have abused the platform's recommendation engine to target some of its youngest users.
It also flagged a case in which the app's moderators did not ban a user who had been reported for sending sexual messages to an account that appeared to belong to a 14-year-old girl, but was in fact controlled by the programme.
TikTok has denied it was prompted into action by Panorama and said it was constantly working on new security measures.
The new features relate to TikTok's Family Pairing facility.
This is activated by:
- going to Settings
- choosing Family Pairing
- scanning a QR barcode to identify which is the adult's device and which the child's
The function was launched earlier this year, but until now it only allowed parents to filter the types of content the child sees, restrict their use of private chats, and cap how much time they spend in the app.
This is now being extended to let parents alter their child's account to:
- prevent comments from being posted to their videos, or allow comments only from friends
- turn off the search function for content, users, hashtags and/or sounds
- limit who can see videos the child has liked
"Today's announcement is just the latest in the steps TikTok has taken this year to keep younger users safe on the platform, including restricting direct messaging to over-16s and prompting all users under 18 to set their account to private when they join," a press release said.
TikTok has retained an option that allows children to "unpair" their device from the parent's.
Doing so sends the adult an alert and gives them 48 hours to restore the link before the child can turn off the restrictions.
Child safety experts support this decision as it allows a balance to be struck between safety and surveillance. The theory is that if teenagers view safety measures as being excessive, it might make them less likely to ask for help if they get into trouble.
One leading charity, however, said the government still needed to ensure TikTok and other apps were held to account if further safety issues arose.
"This feature is a step in the right direction, giving parents extra options to safely tailor social media to what is appropriate for their children," the NSPCC's Andy Burrows told the BBC.
"While this is a useful tool for other platforms to follow, it's clear that the only way to make social networks safer across the board is through regulation that holds tech firms accountable for failing to protect children."
The NSPCC and others are concerned that the government's proposed Online Harms Bill may not come into effect until 2023 or 2024.